Dec 03 12:14:40 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 03 12:14:40 crc restorecon[4537]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 12:14:40 crc restorecon[4537]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc 
restorecon[4537]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 12:14:40 crc 
restorecon[4537]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 
12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 12:14:40 crc 
restorecon[4537]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 12:14:40 crc 
restorecon[4537]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 
crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 
12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 12:14:40 crc 
restorecon[4537]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc 
restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc 
restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:14:40 crc restorecon[4537]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 
crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc 
restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:40 crc restorecon[4537]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc 
restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc 
restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:14:41 crc 
restorecon[4537]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:14:41 crc restorecon[4537]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 12:14:41 crc restorecon[4537]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 12:14:41 crc restorecon[4537]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 03 12:14:41 crc kubenswrapper[4711]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 12:14:41 crc kubenswrapper[4711]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 03 12:14:41 crc kubenswrapper[4711]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 12:14:41 crc kubenswrapper[4711]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 03 12:14:41 crc kubenswrapper[4711]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 03 12:14:41 crc kubenswrapper[4711]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.632953 4711 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639362 4711 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639395 4711 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639405 4711 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639417 4711 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639427 4711 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639436 4711 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639448 4711 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639460 4711 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639470 4711 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639480 4711 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639489 4711 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639498 4711 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639507 4711 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639516 4711 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639525 4711 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639535 4711 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639544 4711 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639553 4711 feature_gate.go:330] unrecognized feature gate: Example Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639565 4711 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639574 4711 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639583 4711 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639592 4711 
feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639601 4711 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639609 4711 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639618 4711 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639626 4711 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639635 4711 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639643 4711 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639652 4711 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639660 4711 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639671 4711 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639683 4711 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639694 4711 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639705 4711 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639717 4711 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639726 4711 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639736 4711 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639746 4711 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639755 4711 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639764 4711 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639773 4711 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639781 4711 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639790 4711 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639798 4711 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639806 4711 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639815 4711 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639828 4711 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639838 4711 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639846 4711 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639856 4711 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639864 4711 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639875 4711 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639884 4711 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639893 4711 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639902 4711 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639935 4711 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639944 4711 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639953 4711 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639961 4711 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639969 4711 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639978 4711 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639986 4711 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.639995 4711 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.640003 4711 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.640011 4711 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.640022 4711 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.640031 4711 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.640040 4711 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.640049 4711 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.640058 4711 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.640066 4711 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640225 4711 flags.go:64] FLAG: --address="0.0.0.0"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640242 4711 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640259 4711 flags.go:64] FLAG: --anonymous-auth="true"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640271 4711 flags.go:64] FLAG: --application-metrics-count-limit="100"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640283 4711 flags.go:64] FLAG: --authentication-token-webhook="false"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640293 4711 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640305 4711 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640317 4711 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640328 4711 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640338 4711 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640348 4711 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640358 4711 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640369 4711 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640379 4711 flags.go:64] FLAG: --cgroup-root=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640388 4711 flags.go:64] FLAG: --cgroups-per-qos="true"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640398 4711 flags.go:64] FLAG: --client-ca-file=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640408 4711 flags.go:64] FLAG: --cloud-config=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640417 4711 flags.go:64] FLAG: --cloud-provider=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640427 4711 flags.go:64] FLAG: --cluster-dns="[]"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640439 4711 flags.go:64] FLAG: --cluster-domain=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640448 4711 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640459 4711 flags.go:64] FLAG: --config-dir=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640468 4711 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640479 4711 flags.go:64] FLAG: --container-log-max-files="5"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640497 4711 flags.go:64] FLAG: --container-log-max-size="10Mi"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640507 4711 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640517 4711 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640527 4711 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640537 4711 flags.go:64] FLAG: --contention-profiling="false"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640547 4711 flags.go:64] FLAG: --cpu-cfs-quota="true"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640557 4711 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640567 4711 flags.go:64] FLAG: --cpu-manager-policy="none"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640576 4711 flags.go:64] FLAG: --cpu-manager-policy-options=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640588 4711 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640598 4711 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640607 4711 flags.go:64] FLAG: --enable-debugging-handlers="true"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640617 4711 flags.go:64] FLAG: --enable-load-reader="false"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640627 4711 flags.go:64] FLAG: --enable-server="true"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640637 4711 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640649 4711 flags.go:64] FLAG: --event-burst="100"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640660 4711 flags.go:64] FLAG: --event-qps="50"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640670 4711 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640680 4711 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640690 4711 flags.go:64] FLAG: --eviction-hard=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640703 4711 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640713 4711 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640723 4711 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640733 4711 flags.go:64] FLAG: --eviction-soft=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640743 4711 flags.go:64] FLAG: --eviction-soft-grace-period=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640752 4711 flags.go:64] FLAG: --exit-on-lock-contention="false"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640762 4711 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640771 4711 flags.go:64] FLAG: --experimental-mounter-path=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640780 4711 flags.go:64] FLAG: --fail-cgroupv1="false"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640790 4711 flags.go:64] FLAG: --fail-swap-on="true"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640800 4711 flags.go:64] FLAG: --feature-gates=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640812 4711 flags.go:64] FLAG: --file-check-frequency="20s"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640822 4711 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640832 4711 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640842 4711 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640852 4711 flags.go:64] FLAG: --healthz-port="10248"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640862 4711 flags.go:64] FLAG: --help="false"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640872 4711 flags.go:64] FLAG: --hostname-override=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640882 4711 flags.go:64] FLAG: --housekeeping-interval="10s"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640891 4711 flags.go:64] FLAG: --http-check-frequency="20s"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640901 4711 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640936 4711 flags.go:64] FLAG: --image-credential-provider-config=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640946 4711 flags.go:64] FLAG: --image-gc-high-threshold="85"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640955 4711 flags.go:64] FLAG: --image-gc-low-threshold="80"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640965 4711 flags.go:64] FLAG: --image-service-endpoint=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640975 4711 flags.go:64] FLAG: --kernel-memcg-notification="false"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640984 4711 flags.go:64] FLAG: --kube-api-burst="100"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.640994 4711 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641004 4711 flags.go:64] FLAG: --kube-api-qps="50"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641014 4711 flags.go:64] FLAG: --kube-reserved=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641024 4711 flags.go:64] FLAG: --kube-reserved-cgroup=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641034 4711 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641044 4711 flags.go:64] FLAG: --kubelet-cgroups=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641053 4711 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641063 4711 flags.go:64] FLAG: --lock-file=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641072 4711 flags.go:64] FLAG: --log-cadvisor-usage="false"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641083 4711 flags.go:64] FLAG: --log-flush-frequency="5s"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641093 4711 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641106 4711 flags.go:64] FLAG: --log-json-split-stream="false"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641116 4711 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641125 4711 flags.go:64] FLAG: --log-text-split-stream="false"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641136 4711 flags.go:64] FLAG: --logging-format="text"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641146 4711 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641156 4711 flags.go:64] FLAG: --make-iptables-util-chains="true"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641166 4711 flags.go:64] FLAG: --manifest-url=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641176 4711 flags.go:64] FLAG: --manifest-url-header=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641188 4711 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641199 4711 flags.go:64] FLAG: --max-open-files="1000000"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641211 4711 flags.go:64] FLAG: --max-pods="110"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641220 4711 flags.go:64] FLAG: --maximum-dead-containers="-1"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641230 4711 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641241 4711 flags.go:64] FLAG: --memory-manager-policy="None"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641250 4711 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641260 4711 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641270 4711 flags.go:64] FLAG: --node-ip="192.168.126.11"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641280 4711 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641299 4711 flags.go:64] FLAG: --node-status-max-images="50"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641309 4711 flags.go:64] FLAG: --node-status-update-frequency="10s"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641319 4711 flags.go:64] FLAG: --oom-score-adj="-999"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641329 4711 flags.go:64] FLAG: --pod-cidr=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641338 4711 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641351 4711 flags.go:64] FLAG: --pod-manifest-path=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641361 4711 flags.go:64] FLAG: --pod-max-pids="-1"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641371 4711 flags.go:64] FLAG: --pods-per-core="0"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641381 4711 flags.go:64] FLAG: --port="10250"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641391 4711 flags.go:64] FLAG: --protect-kernel-defaults="false"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641401 4711 flags.go:64] FLAG: --provider-id=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641413 4711 flags.go:64] FLAG: --qos-reserved=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641423 4711 flags.go:64] FLAG: --read-only-port="10255"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641433 4711 flags.go:64] FLAG: --register-node="true"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641443 4711 flags.go:64] FLAG: --register-schedulable="true"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641452 4711 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641468 4711 flags.go:64] FLAG: --registry-burst="10"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641479 4711 flags.go:64] FLAG: --registry-qps="5"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641488 4711 flags.go:64] FLAG: --reserved-cpus=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641498 4711 flags.go:64] FLAG: --reserved-memory=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641509 4711 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641519 4711 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641529 4711 flags.go:64] FLAG: --rotate-certificates="false"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641539 4711 flags.go:64] FLAG: --rotate-server-certificates="false"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641548 4711 flags.go:64] FLAG: --runonce="false"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641558 4711 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641568 4711 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641579 4711 flags.go:64] FLAG: --seccomp-default="false"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641589 4711 flags.go:64] FLAG: --serialize-image-pulls="true"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641598 4711 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641608 4711 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641618 4711 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641628 4711 flags.go:64] FLAG: --storage-driver-password="root"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641638 4711 flags.go:64] FLAG: --storage-driver-secure="false"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641648 4711 flags.go:64] FLAG: --storage-driver-table="stats"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641657 4711 flags.go:64] FLAG: --storage-driver-user="root"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641667 4711 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641677 4711 flags.go:64] FLAG: --sync-frequency="1m0s"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641687 4711 flags.go:64] FLAG: --system-cgroups=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641696 4711 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641711 4711 flags.go:64] FLAG: --system-reserved-cgroup=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641721 4711 flags.go:64] FLAG: --tls-cert-file=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641731 4711 flags.go:64] FLAG: --tls-cipher-suites="[]"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641742 4711 flags.go:64] FLAG: --tls-min-version=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641751 4711 flags.go:64] FLAG: --tls-private-key-file=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641761 4711 flags.go:64] FLAG: --topology-manager-policy="none"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641771 4711 flags.go:64] FLAG: --topology-manager-policy-options=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641780 4711 flags.go:64] FLAG: --topology-manager-scope="container"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641790 4711 flags.go:64] FLAG: --v="2"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641802 4711 flags.go:64] FLAG: --version="false"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641813 4711 flags.go:64] FLAG: --vmodule=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641824 4711 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.641834 4711 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642068 4711 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642080 4711 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642090 4711 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642099 4711 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642108 4711 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642117 4711 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642127 4711 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642136 4711 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642145 4711 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642156 4711 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642168 4711 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642177 4711 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642185 4711 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642194 4711 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642202 4711 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642211 4711 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642220 4711 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642230 4711 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642238 4711 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642246 4711 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642256 4711 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642265 4711 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642273 4711 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642281 4711 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642289 4711 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642298 4711 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642306 4711 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642315 4711 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642324 4711 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642332 4711 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642340 4711 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642349 4711 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642358 4711 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642366 4711 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642374 4711 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642383 4711 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642392 4711 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642401 4711 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642409 4711 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642418 4711 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642426 4711 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642434 4711 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642443 4711 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642451 4711 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642460 4711 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642468 4711 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642477 4711 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642486 4711 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642494 4711 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642505 4711 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642517 4711 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642551 4711 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642572 4711 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642582 4711 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642592 4711 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642601 4711 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642610 4711 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642619 4711 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642629 4711 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642638 4711 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642650 4711 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642662 4711 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642672 4711 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642682 4711 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642692 4711 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642704 4711 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642716 4711 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642727 4711 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642738 4711 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642749 4711 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.642760 4711 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.643031 4711 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.655798 4711 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.655847 4711 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656025 4711 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656050 4711 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656060 4711 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656070 4711 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656079 4711 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656088 4711 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656098 4711 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656106 4711 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656115 4711 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656123 4711 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656132 4711 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656141 4711 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656150 4711 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656158 4711 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656167 4711 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656175 4711 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656184 4711 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656193 4711 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656202 4711 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656210 4711 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656219 4711 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656227 4711 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656236 4711 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656245 4711 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656256 4711 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656272 4711 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656283 4711 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656293 4711 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656302 4711 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656313 4711 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656323 4711 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656335 4711 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656344 4711 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656355 4711 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656364 4711 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656374 4711 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656384 4711 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656393 4711 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656402 4711 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656412 4711 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656421 4711 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656430 4711 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656442 4711 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656451 4711 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656462 4711 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656472 4711 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656482 4711 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656491 4711 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656502 4711 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656511 4711 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656521 4711 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656530 4711 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656539 4711 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656549 4711 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656558 4711 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656567 4711 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656576 4711 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656585 4711 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656593 4711 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656602 4711 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656611 4711 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656619 4711 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656628 4711 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656636 4711 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656645 4711 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656653 4711 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656661 4711 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656670 4711 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656678 4711 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656687 4711 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656695 4711 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.656710 4711 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656976 4711 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.656995 4711 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657010 4711 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657027 4711 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657038 4711 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657049 4711 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657058 4711 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657067 4711 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657076 4711 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657085 4711 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657095 4711 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657104 4711 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657115 4711 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657126 4711 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657136 4711 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657146 4711 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657155 4711 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657165 4711 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657174 4711 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657183 4711 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657191 4711 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657201 4711 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657210 4711 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657218 4711 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657227 4711 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657235 4711 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657246 4711 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657256 4711 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657265 4711 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657274 4711 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657282 4711 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657291 4711 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657300 4711 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657308 4711 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657317 4711 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657326 4711 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657335 4711 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657345 4711 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657354 4711 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657362 4711 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657371 4711 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657380 4711 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657388 4711 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657397 4711 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657405 4711 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657414 4711 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657422 4711 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657432 4711 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657441 4711 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657449 4711 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657458 4711 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657466 4711 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657475 4711 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657483 4711 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657492 4711 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657501 4711 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657509 4711 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657518 4711 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657527 4711 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657535 4711 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657543 4711 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657552 4711 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657560 4711 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657568 4711 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657577 4711 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657585 4711 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657593 4711 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657601 4711 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657613 4711 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657624 4711 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.657633 4711 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.657645 4711 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.658244 4711 server.go:940] "Client rotation is on, will bootstrap in background"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.663208 4711 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.663392 4711 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.664798 4711 server.go:997] "Starting client certificate rotation"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.664847 4711 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.665549 4711 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-22 06:46:44.321117834 +0000 UTC
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.665619 4711 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 450h32m2.655504044s for next certificate rotation
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.675456 4711 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.678535 4711 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.692256 4711 log.go:25] "Validated CRI v1 runtime API"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.720146 4711 log.go:25] "Validated CRI v1 image API"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.722130 4711 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.724816 4711 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-03-12-09-07-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.724842 4711 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:40 fsType:tmpfs blockSize:0}]
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.739607 4711 manager.go:217] Machine: {Timestamp:2025-12-03 12:14:41.737847903 +0000 UTC m=+0.407099178 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:4ce16df9-f35c-45ca-ad31-f0686ce28357 BootID:6b4f6397-1a81-4c19-aa75-c22851b76849 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:40 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:43:72:b8 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:43:72:b8 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:7f:75:f9 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:9d:1f:d6 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:88:d8:3e Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:89:a7:60 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:b2:be:86:62:b5:a8 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:da:b8:16:03:81:22 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.739851 4711 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.740035 4711 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.741755 4711 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.741963 4711 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.741989 4711 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.742962 4711 topology_manager.go:138] "Creating topology manager with none policy"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.742988 4711 container_manager_linux.go:303] "Creating device plugin manager"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.743215 4711 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.743244 4711 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.743527 4711 state_mem.go:36] "Initialized new in-memory state store"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.743612 4711 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.744293 4711 kubelet.go:418] "Attempting to sync node with API server"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.744317 4711 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.744337 4711 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.744349 4711 kubelet.go:324] "Adding apiserver pod source"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.744359 4711 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.745978 4711 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.746557 4711 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.748023 4711 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.748572 4711 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.748598 4711 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.748606 4711 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.748614 4711 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.748625 4711 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.748632 4711 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.748639 4711 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.748654 4711 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.748662 4711 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.748669 4711 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.748679 4711
plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.748687 4711 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.749311 4711 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.749608 4711 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Dec 03 12:14:41 crc kubenswrapper[4711]: E1203 12:14:41.749697 4711 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.749830 4711 server.go:1280] "Started kubelet" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.750064 4711 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.750050 4711 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.750485 4711 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Dec 03 12:14:41 crc systemd[1]: Started Kubernetes Kubelet. 
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.751485 4711 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.752033 4711 server.go:460] "Adding debug handlers to kubelet server"
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.752225 4711 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused
Dec 03 12:14:41 crc kubenswrapper[4711]: E1203 12:14:41.752874 4711 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.752967 4711 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.753319 4711 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 12:00:57.521064786 +0000 UTC
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.753619 4711 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.754054 4711 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.754096 4711 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.754298 4711 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 03 12:14:41 crc kubenswrapper[4711]: E1203 12:14:41.754680 4711 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.755363 4711 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused
Dec 03 12:14:41 crc kubenswrapper[4711]: E1203 12:14:41.755504 4711 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError"
Dec 03 12:14:41 crc kubenswrapper[4711]: E1203 12:14:41.756566 4711 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="200ms"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.756736 4711 factory.go:55] Registering systemd factory
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.756764 4711 factory.go:221] Registration of the systemd container factory successfully
Dec 03 12:14:41 crc kubenswrapper[4711]: E1203 12:14:41.756543 4711 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.38:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187db3950d10bebd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 12:14:41.749802685 +0000 UTC m=+0.419053940,LastTimestamp:2025-12-03 12:14:41.749802685 +0000 UTC m=+0.419053940,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.757357 4711 factory.go:153] Registering CRI-O factory
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.757398 4711 factory.go:221] Registration of the crio container factory successfully
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.757565 4711 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.757658 4711 factory.go:103] Registering Raw factory
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.757705 4711 manager.go:1196] Started watching for new ooms in manager
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.759260 4711 manager.go:319] Starting recovery of all containers
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771524 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771579 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771593 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771602 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771613 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771623 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771648 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771659 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771671 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771680 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771690 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771704 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771713 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771725 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771737 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771747 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771757 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771767 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771778 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771790 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771801 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771812 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771853 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771864 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771875 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771883 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771894 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771919 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771930 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771939 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771950 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771960 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771971 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771982 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.771992 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772002 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772011 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772024 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772032 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772042 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772051 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772062 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772071 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772082 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772096 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772109 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772124 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772136 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772148 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772163 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772179 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772189 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772202 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772214 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772226 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772236 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772246 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772257 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772267 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772276 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772286 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772297 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772307 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772316 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772325 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772334 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772344 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772354 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772363 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772372 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772382 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772392 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772410 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772422 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772432 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772445 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772455 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772463 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772473 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772482 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772492 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772502 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772510 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772519 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772528 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772538 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772561 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772575 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772587 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images"
seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772598 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772609 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772624 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772634 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.772643 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774346 4711 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" 
deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774375 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774402 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774414 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774424 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774435 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774445 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774455 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774466 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774476 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774486 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774503 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774513 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774524 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774533 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774543 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774555 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774564 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774574 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774584 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774593 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774603 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774615 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774625 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774634 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774645 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774655 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774664 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774673 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774684 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774694 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774704 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774714 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774722 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774731 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774740 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774749 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774765 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774775 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774783 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774795 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774804 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774873 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774887 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774900 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774922 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774932 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774941 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774951 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 03 
12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774961 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774972 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774982 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.774992 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775001 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775012 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775020 4711 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775029 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775039 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775050 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775061 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775070 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775080 4711 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775088 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775099 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775110 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775118 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775128 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775137 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775148 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775157 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775167 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775177 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775186 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775196 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775205 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775214 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775224 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775234 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775244 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775253 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775263 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775272 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775282 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775291 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775300 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775309 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" 
seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775325 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775335 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775348 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775358 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775366 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775376 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 
12:14:41.775384 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775393 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775403 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775411 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775421 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775429 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775437 4711 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775446 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775464 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775472 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775480 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775488 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775500 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775509 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775520 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775531 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775540 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775550 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775560 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775569 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775580 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775589 4711 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775598 4711 reconstruct.go:97] "Volume reconstruction finished" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.775605 4711 reconciler.go:26] "Reconciler: start to sync state" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.787198 4711 manager.go:324] Recovery completed Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.800809 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.803123 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.803297 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.803427 
4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.804845 4711 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.804868 4711 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.804985 4711 state_mem.go:36] "Initialized new in-memory state store" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.813015 4711 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.815850 4711 policy_none.go:49] "None policy: Start" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.815902 4711 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.815980 4711 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.816022 4711 kubelet.go:2335] "Starting kubelet main sync loop" Dec 03 12:14:41 crc kubenswrapper[4711]: E1203 12:14:41.816137 4711 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 03 12:14:41 crc kubenswrapper[4711]: W1203 12:14:41.817008 4711 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Dec 03 12:14:41 crc kubenswrapper[4711]: E1203 12:14:41.817068 4711 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.817480 4711 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.817504 4711 state_mem.go:35] "Initializing new in-memory state store" Dec 03 12:14:41 crc kubenswrapper[4711]: E1203 12:14:41.855368 4711 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.879306 4711 manager.go:334] "Starting Device Plugin manager" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.879551 4711 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.879576 4711 server.go:79] "Starting device plugin registration server" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.879993 4711 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.880009 4711 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.881320 4711 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.881416 4711 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.881426 4711 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 03 12:14:41 crc kubenswrapper[4711]: E1203 12:14:41.892539 4711 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 03 12:14:41 crc 
kubenswrapper[4711]: I1203 12:14:41.916535 4711 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.916652 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.918314 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.918345 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.918357 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.918460 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.918856 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.919024 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.919200 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.919223 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.919234 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.919322 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.919501 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.919562 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.920252 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.920276 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.920286 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.920389 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.920568 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.920609 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.921006 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.921217 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.921257 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.921371 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.921383 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.921053 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.921453 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.921470 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.921486 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.921516 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 
12:14:41.921539 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.921582 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.921695 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.921729 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.921346 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.922882 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.922941 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.922957 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.923355 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.923406 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.923440 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.923666 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.923717 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.924824 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.924865 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.924884 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:41 crc kubenswrapper[4711]: E1203 12:14:41.958312 4711 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="400ms" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.978683 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.978737 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.978764 4711 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.978804 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.978845 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.978900 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.978977 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.979004 4711 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.979030 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.979269 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.979370 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.979404 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.979433 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.979462 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.979496 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.980089 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.981509 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.981563 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.981582 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:41 crc kubenswrapper[4711]: I1203 12:14:41.981621 4711 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 12:14:41 crc kubenswrapper[4711]: E1203 12:14:41.982241 4711 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.38:6443: connect: connection refused" 
node="crc" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.080547 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.080644 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.080692 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.080739 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.080783 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.080824 4711 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.080870 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.080892 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.080965 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.080982 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.081022 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.081034 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.080993 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.080818 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.080894 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.080969 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 
03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.080820 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.081140 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.081190 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.081195 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.081255 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.081218 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.081262 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.081354 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.081314 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.081416 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.081415 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.081478 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.081478 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.081634 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: E1203 12:14:42.105138 4711 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.38:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187db3950d10bebd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 12:14:41.749802685 +0000 UTC m=+0.419053940,LastTimestamp:2025-12-03 12:14:41.749802685 +0000 UTC m=+0.419053940,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.183216 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.184850 
4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.184989 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.185009 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.185100 4711 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 12:14:42 crc kubenswrapper[4711]: E1203 12:14:42.185997 4711 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.38:6443: connect: connection refused" node="crc" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.249624 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.259195 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.279891 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: W1203 12:14:42.284729 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-c8d464ee1d6659dae89884353671edae6b511ea4ecfaa95f0df4785115ef594f WatchSource:0}: Error finding container c8d464ee1d6659dae89884353671edae6b511ea4ecfaa95f0df4785115ef594f: Status 404 returned error can't find the container with id c8d464ee1d6659dae89884353671edae6b511ea4ecfaa95f0df4785115ef594f Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.288286 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: W1203 12:14:42.288784 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-332fe2999bb95e9a077f51eeffe2a355367c864d0de214268848a294d4a7a014 WatchSource:0}: Error finding container 332fe2999bb95e9a077f51eeffe2a355367c864d0de214268848a294d4a7a014: Status 404 returned error can't find the container with id 332fe2999bb95e9a077f51eeffe2a355367c864d0de214268848a294d4a7a014 Dec 03 12:14:42 crc kubenswrapper[4711]: W1203 12:14:42.302956 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-14fa20f5014d0221a53bd95c653cef4c2a11798334af9b3f39c4ad511aee0e19 WatchSource:0}: Error finding container 14fa20f5014d0221a53bd95c653cef4c2a11798334af9b3f39c4ad511aee0e19: Status 404 returned error can't find the container with id 14fa20f5014d0221a53bd95c653cef4c2a11798334af9b3f39c4ad511aee0e19 Dec 03 12:14:42 crc kubenswrapper[4711]: W1203 12:14:42.305489 4711 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-0b6205ba98256520c15ecdd69768dc17f80604b6bc1aa4fe8bbe4f3f009751c7 WatchSource:0}: Error finding container 0b6205ba98256520c15ecdd69768dc17f80604b6bc1aa4fe8bbe4f3f009751c7: Status 404 returned error can't find the container with id 0b6205ba98256520c15ecdd69768dc17f80604b6bc1aa4fe8bbe4f3f009751c7 Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.306960 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:14:42 crc kubenswrapper[4711]: W1203 12:14:42.336566 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-e1d000f05dfb1e5cb6bbd88ee10b313fa525f591708636cc7233465523b350bb WatchSource:0}: Error finding container e1d000f05dfb1e5cb6bbd88ee10b313fa525f591708636cc7233465523b350bb: Status 404 returned error can't find the container with id e1d000f05dfb1e5cb6bbd88ee10b313fa525f591708636cc7233465523b350bb Dec 03 12:14:42 crc kubenswrapper[4711]: E1203 12:14:42.359034 4711 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="800ms" Dec 03 12:14:42 crc kubenswrapper[4711]: W1203 12:14:42.572017 4711 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Dec 03 12:14:42 crc kubenswrapper[4711]: E1203 12:14:42.572116 4711 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed 
to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.586441 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.589254 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.589293 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.589308 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.589336 4711 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 12:14:42 crc kubenswrapper[4711]: E1203 12:14:42.589752 4711 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.38:6443: connect: connection refused" node="crc" Dec 03 12:14:42 crc kubenswrapper[4711]: W1203 12:14:42.708086 4711 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Dec 03 12:14:42 crc kubenswrapper[4711]: E1203 12:14:42.708572 4711 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" 
logger="UnhandledError" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.751846 4711 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.753925 4711 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 06:13:56.433018597 +0000 UTC Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.753977 4711 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 857h59m13.679044676s for next certificate rotation Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.821546 4711 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948" exitCode=0 Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.821625 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948"} Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.821716 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0b6205ba98256520c15ecdd69768dc17f80604b6bc1aa4fe8bbe4f3f009751c7"} Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.821835 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.823020 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:42 crc 
kubenswrapper[4711]: I1203 12:14:42.823048 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.823056 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.823811 4711 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="19df5db43c6bd42f2676230cceec988e2976753f3b41b3d5381a18910570c4a5" exitCode=0 Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.823832 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"19df5db43c6bd42f2676230cceec988e2976753f3b41b3d5381a18910570c4a5"} Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.823867 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"14fa20f5014d0221a53bd95c653cef4c2a11798334af9b3f39c4ad511aee0e19"} Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.823962 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.824516 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.824524 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.824537 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.824546 4711 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.825366 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.825395 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.825403 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.826136 4711 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c14e6a215782f14a8b7c8dc6f6e06079dbadc32cd6c5020ce0c223b71c1199dc" exitCode=0 Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.826188 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c14e6a215782f14a8b7c8dc6f6e06079dbadc32cd6c5020ce0c223b71c1199dc"} Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.826206 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c8d464ee1d6659dae89884353671edae6b511ea4ecfaa95f0df4785115ef594f"} Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.826282 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.827108 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.827127 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 
12:14:42.827135 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.828644 4711 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="a722d33ed10d5693ae916d46d2bac91ce7d7709f0933ce882ea8e1292700de2d" exitCode=0 Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.828699 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"a722d33ed10d5693ae916d46d2bac91ce7d7709f0933ce882ea8e1292700de2d"} Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.828718 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"332fe2999bb95e9a077f51eeffe2a355367c864d0de214268848a294d4a7a014"} Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.828783 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.829629 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.829659 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.829671 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.830597 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c"} Dec 03 12:14:42 crc kubenswrapper[4711]: I1203 12:14:42.830628 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e1d000f05dfb1e5cb6bbd88ee10b313fa525f591708636cc7233465523b350bb"} Dec 03 12:14:42 crc kubenswrapper[4711]: W1203 12:14:42.987552 4711 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Dec 03 12:14:42 crc kubenswrapper[4711]: E1203 12:14:42.987628 4711 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError" Dec 03 12:14:43 crc kubenswrapper[4711]: E1203 12:14:43.160322 4711 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="1.6s" Dec 03 12:14:43 crc kubenswrapper[4711]: W1203 12:14:43.273859 4711 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Dec 03 12:14:43 crc kubenswrapper[4711]: E1203 12:14:43.273982 4711 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError" Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.390682 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.392069 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.392137 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.392160 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.392215 4711 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 12:14:43 crc kubenswrapper[4711]: E1203 12:14:43.392809 4711 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.38:6443: connect: connection refused" node="crc" Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.752038 4711 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.835486 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c4f7fe5dd73fcc75c47d02eaace24544071dbb6fd692964dc4e03df24e0fdfd7"} Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.835536 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"200cc11da19da4cb96844e0e59f1114f5bff8bedda1e49dac120f7a20f30b834"} Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.835548 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1492a930ad0b8a0e49c66235300e2d199ff927035f6dfa3bd51aefc59470dba2"} Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.835638 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.836544 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.836581 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.836592 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.837289 4711 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="19ebc39f88b18264eef6d8a0669fcf37e99a4022a642695814ebc4bbe5aab6b6" exitCode=0 Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.837356 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"19ebc39f88b18264eef6d8a0669fcf37e99a4022a642695814ebc4bbe5aab6b6"} Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.837446 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.838089 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.838115 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.838125 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.838894 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"9805346e5e71e3954a1c2e83f16f8ec65048b2d76f9def6362e4a65191e15678"} Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.839004 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.839774 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.839794 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.839803 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.841861 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e"} Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.841899 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce"} Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.841941 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81"} Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.841920 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.842757 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.842782 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.842793 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.846645 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054"} Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.846676 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0"} Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.846687 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84"} Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.846695 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce"} Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.846704 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0"} Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.846773 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.847384 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.847409 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:43 crc kubenswrapper[4711]: I1203 12:14:43.847417 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:44 crc kubenswrapper[4711]: I1203 12:14:44.851686 4711 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" 
containerID="493f74e33220c512042fce684c72e679e157c58c0ce7ef0c2f4a7649f30822a2" exitCode=0 Dec 03 12:14:44 crc kubenswrapper[4711]: I1203 12:14:44.851801 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"493f74e33220c512042fce684c72e679e157c58c0ce7ef0c2f4a7649f30822a2"} Dec 03 12:14:44 crc kubenswrapper[4711]: I1203 12:14:44.851825 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:44 crc kubenswrapper[4711]: I1203 12:14:44.852073 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:44 crc kubenswrapper[4711]: I1203 12:14:44.853492 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:44 crc kubenswrapper[4711]: I1203 12:14:44.853521 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:44 crc kubenswrapper[4711]: I1203 12:14:44.853540 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:44 crc kubenswrapper[4711]: I1203 12:14:44.853554 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:44 crc kubenswrapper[4711]: I1203 12:14:44.853561 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:44 crc kubenswrapper[4711]: I1203 12:14:44.853584 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:44 crc kubenswrapper[4711]: I1203 12:14:44.993559 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:44 crc kubenswrapper[4711]: I1203 12:14:44.996663 4711 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:44 crc kubenswrapper[4711]: I1203 12:14:44.996794 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:44 crc kubenswrapper[4711]: I1203 12:14:44.997228 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:44 crc kubenswrapper[4711]: I1203 12:14:44.997383 4711 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 12:14:45 crc kubenswrapper[4711]: I1203 12:14:45.692957 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:14:45 crc kubenswrapper[4711]: I1203 12:14:45.693068 4711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 12:14:45 crc kubenswrapper[4711]: I1203 12:14:45.693101 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:45 crc kubenswrapper[4711]: I1203 12:14:45.694371 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:45 crc kubenswrapper[4711]: I1203 12:14:45.694420 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:45 crc kubenswrapper[4711]: I1203 12:14:45.694433 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:45 crc kubenswrapper[4711]: I1203 12:14:45.858824 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8d6986b32311b7514d8eced07452152af7f2d049a591b7bc5ae2bc0653541ab5"} Dec 03 12:14:45 crc kubenswrapper[4711]: I1203 12:14:45.865356 4711 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:14:45 crc kubenswrapper[4711]: I1203 12:14:45.865536 4711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 12:14:45 crc kubenswrapper[4711]: I1203 12:14:45.865589 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:45 crc kubenswrapper[4711]: I1203 12:14:45.866970 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:45 crc kubenswrapper[4711]: I1203 12:14:45.867047 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:45 crc kubenswrapper[4711]: I1203 12:14:45.867072 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:46 crc kubenswrapper[4711]: I1203 12:14:46.214281 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 12:14:46 crc kubenswrapper[4711]: I1203 12:14:46.214506 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:46 crc kubenswrapper[4711]: I1203 12:14:46.216125 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:46 crc kubenswrapper[4711]: I1203 12:14:46.216184 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:46 crc kubenswrapper[4711]: I1203 12:14:46.216203 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:46 crc kubenswrapper[4711]: I1203 12:14:46.867629 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ef7f0741f6d439bc5568ed32e6b2a4f4d6980d91d9a542a18a92dd0268a75657"} Dec 03 12:14:46 crc kubenswrapper[4711]: I1203 12:14:46.867708 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b1e77575933be2d338e546c3b1b16557610bf1d392816130718d894b7f139605"} Dec 03 12:14:47 crc kubenswrapper[4711]: I1203 12:14:47.199193 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:14:47 crc kubenswrapper[4711]: I1203 12:14:47.199402 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:47 crc kubenswrapper[4711]: I1203 12:14:47.200526 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:47 crc kubenswrapper[4711]: I1203 12:14:47.200559 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:47 crc kubenswrapper[4711]: I1203 12:14:47.200569 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:47 crc kubenswrapper[4711]: I1203 12:14:47.203971 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:14:47 crc kubenswrapper[4711]: I1203 12:14:47.875031 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:47 crc kubenswrapper[4711]: I1203 12:14:47.875594 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:47 crc kubenswrapper[4711]: I1203 12:14:47.875939 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"fd8e027b4ea6a21fbf7c5d98ec263bbef0f25245516ba19ca3ef2e21fb6851b5"} Dec 03 12:14:47 crc kubenswrapper[4711]: I1203 12:14:47.875975 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3811faf097cf567452cb52f613f23d15ec337fc490c83661bce83c84301854e0"} Dec 03 12:14:47 crc kubenswrapper[4711]: I1203 12:14:47.875994 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:14:47 crc kubenswrapper[4711]: I1203 12:14:47.876460 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:47 crc kubenswrapper[4711]: I1203 12:14:47.876493 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:47 crc kubenswrapper[4711]: I1203 12:14:47.876504 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:47 crc kubenswrapper[4711]: I1203 12:14:47.876500 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:47 crc kubenswrapper[4711]: I1203 12:14:47.876545 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:47 crc kubenswrapper[4711]: I1203 12:14:47.876580 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:48 crc kubenswrapper[4711]: I1203 12:14:48.876686 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:48 crc kubenswrapper[4711]: I1203 12:14:48.876777 4711 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Dec 03 12:14:48 crc kubenswrapper[4711]: I1203 12:14:48.877883 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:48 crc kubenswrapper[4711]: I1203 12:14:48.877957 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:48 crc kubenswrapper[4711]: I1203 12:14:48.877966 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:48 crc kubenswrapper[4711]: I1203 12:14:48.877979 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:48 crc kubenswrapper[4711]: I1203 12:14:48.877984 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:48 crc kubenswrapper[4711]: I1203 12:14:48.877988 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:49 crc kubenswrapper[4711]: I1203 12:14:49.591028 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:14:49 crc kubenswrapper[4711]: I1203 12:14:49.591130 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:14:49 crc kubenswrapper[4711]: I1203 12:14:49.591424 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:49 crc kubenswrapper[4711]: I1203 12:14:49.593317 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:49 crc kubenswrapper[4711]: I1203 12:14:49.593378 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:49 crc 
kubenswrapper[4711]: I1203 12:14:49.593401 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:49 crc kubenswrapper[4711]: I1203 12:14:49.878792 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:49 crc kubenswrapper[4711]: I1203 12:14:49.879691 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:49 crc kubenswrapper[4711]: I1203 12:14:49.879727 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:49 crc kubenswrapper[4711]: I1203 12:14:49.879737 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:51 crc kubenswrapper[4711]: E1203 12:14:51.892734 4711 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 03 12:14:52 crc kubenswrapper[4711]: I1203 12:14:52.457222 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:14:52 crc kubenswrapper[4711]: I1203 12:14:52.457357 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:52 crc kubenswrapper[4711]: I1203 12:14:52.459555 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:52 crc kubenswrapper[4711]: I1203 12:14:52.459600 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:52 crc kubenswrapper[4711]: I1203 12:14:52.459613 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:52 crc kubenswrapper[4711]: I1203 12:14:52.481225 4711 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 03 12:14:52 crc kubenswrapper[4711]: I1203 12:14:52.481471 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:52 crc kubenswrapper[4711]: I1203 12:14:52.482770 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:52 crc kubenswrapper[4711]: I1203 12:14:52.482824 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:52 crc kubenswrapper[4711]: I1203 12:14:52.482839 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:53 crc kubenswrapper[4711]: I1203 12:14:53.380258 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:14:53 crc kubenswrapper[4711]: I1203 12:14:53.380403 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:53 crc kubenswrapper[4711]: I1203 12:14:53.381352 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:53 crc kubenswrapper[4711]: I1203 12:14:53.381401 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:53 crc kubenswrapper[4711]: I1203 12:14:53.381412 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:54 crc kubenswrapper[4711]: W1203 12:14:54.710728 4711 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 03 12:14:54 crc kubenswrapper[4711]: I1203 
12:14:54.711084 4711 trace.go:236] Trace[1609920049]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 12:14:44.709) (total time: 10001ms): Dec 03 12:14:54 crc kubenswrapper[4711]: Trace[1609920049]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (12:14:54.710) Dec 03 12:14:54 crc kubenswrapper[4711]: Trace[1609920049]: [10.001488381s] [10.001488381s] END Dec 03 12:14:54 crc kubenswrapper[4711]: E1203 12:14:54.711105 4711 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 03 12:14:54 crc kubenswrapper[4711]: I1203 12:14:54.751768 4711 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 03 12:14:54 crc kubenswrapper[4711]: E1203 12:14:54.760979 4711 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 03 12:14:54 crc kubenswrapper[4711]: E1203 12:14:54.999544 4711 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Dec 03 12:14:55 crc kubenswrapper[4711]: W1203 12:14:55.187769 4711 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 03 12:14:55 crc kubenswrapper[4711]: I1203 12:14:55.187871 4711 trace.go:236] Trace[1710846783]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 12:14:45.186) (total time: 10001ms): Dec 03 12:14:55 crc kubenswrapper[4711]: Trace[1710846783]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (12:14:55.187) Dec 03 12:14:55 crc kubenswrapper[4711]: Trace[1710846783]: [10.001043007s] [10.001043007s] END Dec 03 12:14:55 crc kubenswrapper[4711]: E1203 12:14:55.187895 4711 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 03 12:14:55 crc kubenswrapper[4711]: I1203 12:14:55.541186 4711 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 03 12:14:55 crc kubenswrapper[4711]: I1203 12:14:55.541243 4711 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 03 12:14:55 crc kubenswrapper[4711]: I1203 12:14:55.553459 4711 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver 
namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 03 12:14:55 crc kubenswrapper[4711]: I1203 12:14:55.553525 4711 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 03 12:14:55 crc kubenswrapper[4711]: I1203 12:14:55.876585 4711 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 03 12:14:55 crc kubenswrapper[4711]: [+]log ok Dec 03 12:14:55 crc kubenswrapper[4711]: [+]etcd ok Dec 03 12:14:55 crc kubenswrapper[4711]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 03 12:14:55 crc kubenswrapper[4711]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 03 12:14:55 crc kubenswrapper[4711]: [+]poststarthook/openshift.io-api-request-count-filter ok Dec 03 12:14:55 crc kubenswrapper[4711]: [+]poststarthook/openshift.io-startkubeinformers ok Dec 03 12:14:55 crc kubenswrapper[4711]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Dec 03 12:14:55 crc kubenswrapper[4711]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Dec 03 12:14:55 crc kubenswrapper[4711]: [+]poststarthook/generic-apiserver-start-informers ok Dec 03 12:14:55 crc kubenswrapper[4711]: [+]poststarthook/priority-and-fairness-config-consumer ok Dec 03 12:14:55 crc kubenswrapper[4711]: [+]poststarthook/priority-and-fairness-filter ok Dec 03 12:14:55 crc kubenswrapper[4711]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 03 
12:14:55 crc kubenswrapper[4711]: [+]poststarthook/start-apiextensions-informers ok Dec 03 12:14:55 crc kubenswrapper[4711]: [+]poststarthook/start-apiextensions-controllers ok Dec 03 12:14:55 crc kubenswrapper[4711]: [+]poststarthook/crd-informer-synced ok Dec 03 12:14:55 crc kubenswrapper[4711]: [+]poststarthook/start-system-namespaces-controller ok Dec 03 12:14:55 crc kubenswrapper[4711]: [+]poststarthook/start-cluster-authentication-info-controller ok Dec 03 12:14:55 crc kubenswrapper[4711]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Dec 03 12:14:55 crc kubenswrapper[4711]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Dec 03 12:14:55 crc kubenswrapper[4711]: [+]poststarthook/start-legacy-token-tracking-controller ok Dec 03 12:14:55 crc kubenswrapper[4711]: [+]poststarthook/start-service-ip-repair-controllers ok Dec 03 12:14:55 crc kubenswrapper[4711]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Dec 03 12:14:55 crc kubenswrapper[4711]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Dec 03 12:14:55 crc kubenswrapper[4711]: [+]poststarthook/priority-and-fairness-config-producer ok Dec 03 12:14:55 crc kubenswrapper[4711]: [+]poststarthook/bootstrap-controller ok Dec 03 12:14:55 crc kubenswrapper[4711]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Dec 03 12:14:55 crc kubenswrapper[4711]: [+]poststarthook/start-kube-aggregator-informers ok Dec 03 12:14:55 crc kubenswrapper[4711]: [+]poststarthook/apiservice-status-local-available-controller ok Dec 03 12:14:55 crc kubenswrapper[4711]: [+]poststarthook/apiservice-status-remote-available-controller ok Dec 03 12:14:55 crc kubenswrapper[4711]: [+]poststarthook/apiservice-registration-controller ok Dec 03 12:14:55 crc kubenswrapper[4711]: [+]poststarthook/apiservice-wait-for-first-sync ok Dec 03 12:14:55 crc kubenswrapper[4711]: [+]poststarthook/apiservice-discovery-controller ok Dec 03 12:14:55 crc 
kubenswrapper[4711]: [+]poststarthook/kube-apiserver-autoregistration ok Dec 03 12:14:55 crc kubenswrapper[4711]: [+]autoregister-completion ok Dec 03 12:14:55 crc kubenswrapper[4711]: [+]poststarthook/apiservice-openapi-controller ok Dec 03 12:14:55 crc kubenswrapper[4711]: [+]poststarthook/apiservice-openapiv3-controller ok Dec 03 12:14:55 crc kubenswrapper[4711]: livez check failed Dec 03 12:14:55 crc kubenswrapper[4711]: I1203 12:14:55.876667 4711 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:14:56 crc kubenswrapper[4711]: I1203 12:14:56.380515 4711 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:14:56 crc kubenswrapper[4711]: I1203 12:14:56.380614 4711 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:14:56 crc kubenswrapper[4711]: I1203 12:14:56.503337 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 03 12:14:56 crc kubenswrapper[4711]: I1203 12:14:56.503620 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:56 crc kubenswrapper[4711]: I1203 12:14:56.505005 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 03 12:14:56 crc kubenswrapper[4711]: I1203 12:14:56.505057 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:56 crc kubenswrapper[4711]: I1203 12:14:56.505075 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:56 crc kubenswrapper[4711]: I1203 12:14:56.565246 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 03 12:14:56 crc kubenswrapper[4711]: I1203 12:14:56.895037 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:56 crc kubenswrapper[4711]: I1203 12:14:56.896007 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:56 crc kubenswrapper[4711]: I1203 12:14:56.896052 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:56 crc kubenswrapper[4711]: I1203 12:14:56.896065 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:56 crc kubenswrapper[4711]: I1203 12:14:56.912474 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 03 12:14:57 crc kubenswrapper[4711]: I1203 12:14:57.898243 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:14:57 crc kubenswrapper[4711]: I1203 12:14:57.899521 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:57 crc kubenswrapper[4711]: I1203 12:14:57.899575 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:57 crc kubenswrapper[4711]: I1203 12:14:57.899592 4711 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:58 crc kubenswrapper[4711]: I1203 12:14:58.200440 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 12:14:58 crc kubenswrapper[4711]: I1203 12:14:58.202238 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:58 crc kubenswrapper[4711]: I1203 12:14:58.202288 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:58 crc kubenswrapper[4711]: I1203 12:14:58.202355 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:58 crc kubenswrapper[4711]: I1203 12:14:58.202392 4711 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 03 12:14:58 crc kubenswrapper[4711]: E1203 12:14:58.208856 4711 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Dec 03 12:14:58 crc kubenswrapper[4711]: I1203 12:14:58.546325 4711 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Dec 03 12:14:59 crc kubenswrapper[4711]: I1203 12:14:59.212915 4711 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.556967 4711 trace.go:236] Trace[1999359629]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 12:14:45.745) (total time: 14810ms):
Dec 03 12:15:00 crc kubenswrapper[4711]: Trace[1999359629]: ---"Objects listed" error: 14810ms (12:15:00.556)
Dec 03 12:15:00 crc kubenswrapper[4711]: Trace[1999359629]: [14.810966575s] [14.810966575s] END
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.557038 4711 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.557235 4711 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.558021 4711 trace.go:236] Trace[1508373690]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 12:14:45.692) (total time: 14865ms):
Dec 03 12:15:00 crc kubenswrapper[4711]: Trace[1508373690]: ---"Objects listed" error: 14865ms (12:15:00.557)
Dec 03 12:15:00 crc kubenswrapper[4711]: Trace[1508373690]: [14.865339806s] [14.865339806s] END
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.558056 4711 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.599840 4711 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:48640->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.600844 4711 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:48640->192.168.126.11:17697: read: connection reset by peer"
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.753066 4711 apiserver.go:52] "Watching apiserver"
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.756570 4711 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.757069 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"]
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.757564 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.757707 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 12:15:00 crc kubenswrapper[4711]: E1203 12:15:00.757849 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.757885 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.757707 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.757948 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.757951 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 12:15:00 crc kubenswrapper[4711]: E1203 12:15:00.758055 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 12:15:00 crc kubenswrapper[4711]: E1203 12:15:00.758134 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.760223 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.761101 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.761456 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.761504 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.761576 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.761602 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.761707 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.762034 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.763227 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.788342 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.801383 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.814362 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.827705 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.839826 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.849812 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.856197 4711 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.859116 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.859168 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.859237 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.859278 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.859299 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.859323 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.860417 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.860731 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.859341 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.861272 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.861497 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.861560 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.861591 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.861637 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.861742 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.861887 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.862007 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.862062 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.862094 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.862121 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.862189 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.861666 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.862356 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.862404 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.862440 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.862468 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.862496 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.862521 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.862625 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.862712 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.862779 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.862886 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.862986 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.863094 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.863122 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.863163 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.863206 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.863271 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.863371 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.863409 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.862347 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.861875 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.861910 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4".
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.862197 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.863529 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.862210 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.862326 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.862374 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.862488 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.862497 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.862616 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.862712 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.863616 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.863697 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.863725 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.863813 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 
12:15:00.863868 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.863901 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.863979 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.864037 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.862722 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.862792 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.862813 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.862973 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.863087 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.864259 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.864292 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.864348 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.864381 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.864431 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.864517 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.864552 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.864580 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.864627 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.864671 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.864728 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.864752 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.864775 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.864818 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.864844 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.864884 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.864937 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.864978 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.865015 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.865057 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.865110 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.865154 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 
12:15:00.865214 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.865240 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.865425 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.865482 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.865510 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.865554 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod 
\"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.865623 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.865679 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.865705 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.865734 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.865765 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.865791 
4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.863105 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.863342 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.863441 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.863437 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.863666 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.863762 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.863777 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.863884 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.864050 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.864189 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.864320 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.864474 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.864592 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.864869 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.865359 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.865370 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.865883 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.866138 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.866353 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.866385 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.866482 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.866543 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.866717 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.866765 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.867007 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.867161 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.867204 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.867447 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.867896 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.868065 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.868279 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.868794 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.868847 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.869151 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.869192 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.869218 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.869272 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.869307 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.869345 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.869380 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.869412 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: 
\"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.869442 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.869467 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.869499 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.869531 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.869580 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 12:15:00 crc 
kubenswrapper[4711]: I1203 12:15:00.869611 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.869641 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.869685 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.869715 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.869747 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.869777 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: 
\"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.869801 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.869829 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.869857 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.869881 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.869934 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.869963 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.869992 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.870021 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.870051 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.870081 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.870107 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.870137 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.870165 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.870262 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.870292 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.870318 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 
12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.870343 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881287 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881323 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881341 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881358 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881374 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: 
\"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881391 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881407 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881422 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881439 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881454 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 
12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881472 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881487 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881503 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881519 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881535 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881550 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881566 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881581 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881595 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881611 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881625 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 12:15:00 crc kubenswrapper[4711]: 
I1203 12:15:00.881641 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881656 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881670 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881685 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881701 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881716 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881736 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881752 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881768 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881784 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881802 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881819 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881836 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881850 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881870 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881885 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881900 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") 
" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881931 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881949 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881968 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.881983 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882002 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882019 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882034 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882049 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882064 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882081 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882098 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: 
\"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882121 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882142 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882158 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882174 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882189 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882205 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882220 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882237 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882252 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882267 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882285 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882301 4711 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882316 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882331 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882352 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882373 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882389 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882404 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882421 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882439 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882455 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882470 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 
03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882485 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882502 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882520 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882537 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882554 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882571 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: 
\"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882588 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882659 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882675 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882694 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882710 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.883963 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.884149 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.884170 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.884200 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.884217 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: 
\"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.884234 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.884258 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.884490 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.884531 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.884557 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.884575 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.884597 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.869358 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.869444 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.869680 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.869699 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.869970 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.870060 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.870356 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.879274 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.879291 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.879448 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.879827 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.879841 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882547 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882818 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882830 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.879216 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.879199 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882915 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.882961 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.883027 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.883188 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.883309 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.883340 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.883595 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.883635 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.883853 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.884049 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.884265 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.884277 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.884867 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.884884 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.884979 4711 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.884991 4711 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885001 4711 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885010 4711 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 
12:15:00.885019 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885029 4711 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885039 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885061 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885071 4711 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885079 4711 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885088 4711 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885097 4711 reconciler_common.go:293] "Volume detached for 
volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885107 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885116 4711 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885125 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885118 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885135 4711 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885197 4711 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885217 4711 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885236 4711 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: E1203 12:15:00.885298 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:15:01.385276132 +0000 UTC m=+20.054527387 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885332 4711 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885349 4711 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885361 4711 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885379 4711 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885393 4711 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885406 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: 
\"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885419 4711 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885432 4711 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885448 4711 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885461 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885472 4711 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885484 4711 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885496 4711 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" 
DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885509 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885510 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885523 4711 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885539 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885552 4711 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885565 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885578 4711 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885619 4711 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885633 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885645 4711 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885657 4711 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885672 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885696 4711 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885701 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" 
(OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885709 4711 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885750 4711 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885768 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885783 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885777 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885798 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885802 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.886191 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.886250 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887064 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.885798 4711 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887098 4711 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887112 4711 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887122 4711 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887134 4711 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc 
kubenswrapper[4711]: I1203 12:15:00.887147 4711 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887159 4711 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887171 4711 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887185 4711 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887218 4711 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887230 4711 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887243 4711 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887255 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: 
\"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887267 4711 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887277 4711 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887288 4711 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887300 4711 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887313 4711 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887325 4711 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887338 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc 
kubenswrapper[4711]: I1203 12:15:00.887350 4711 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887364 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887375 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887386 4711 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887397 4711 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887412 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887425 4711 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887441 4711 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887452 4711 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887463 4711 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887474 4711 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887487 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887499 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887511 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887523 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887536 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887093 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887346 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887419 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887813 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.887897 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.888167 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.888469 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.888578 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.888760 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.888795 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.889449 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.890141 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.890192 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.890196 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.890241 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.890375 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.890846 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.890881 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.891090 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.891185 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.891429 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.891468 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.891557 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.891714 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.892182 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.892222 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.892477 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.892739 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.892775 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.892850 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.893016 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.893052 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.893228 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.893303 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.893574 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.893597 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.893912 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.893951 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.894238 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.894380 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.894380 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.894622 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.894666 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.894763 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.895213 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.895330 4711 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.895377 4711 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.895794 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.896305 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.897156 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.897167 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.897287 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.897334 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.897769 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.897778 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.897498 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.897558 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.898007 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.898176 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: E1203 12:15:00.898367 4711 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 12:15:00 crc kubenswrapper[4711]: E1203 12:15:00.898458 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-03 12:15:01.398435673 +0000 UTC m=+20.067687128 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.898550 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: E1203 12:15:00.898698 4711 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 12:15:00 crc kubenswrapper[4711]: E1203 12:15:00.898742 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 12:15:01.398729801 +0000 UTC m=+20.067981266 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.898543 4711 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.898800 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.898820 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.898899 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.898993 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.898723 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.899400 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.899602 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.899741 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.899797 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.899901 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.899749 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.900232 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.894461 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.900325 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.900364 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.900796 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.901112 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.901406 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.901684 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.902347 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.902490 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.904349 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.904846 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.905198 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.906186 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.906358 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.907148 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.909044 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.909627 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.914067 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: E1203 12:15:00.915592 4711 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 12:15:00 crc kubenswrapper[4711]: E1203 12:15:00.915629 4711 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 12:15:00 crc kubenswrapper[4711]: E1203 12:15:00.915646 4711 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:15:00 crc kubenswrapper[4711]: E1203 12:15:00.915711 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 12:15:01.415689623 +0000 UTC m=+20.084941108 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.915854 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.917164 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: E1203 12:15:00.917742 4711 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 12:15:00 crc kubenswrapper[4711]: E1203 12:15:00.918126 4711 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 12:15:00 crc kubenswrapper[4711]: E1203 12:15:00.918166 4711 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:15:00 crc kubenswrapper[4711]: E1203 12:15:00.918255 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 12:15:01.41822134 +0000 UTC m=+20.087472585 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.918744 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.919059 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.906134 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.920693 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.921734 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.922745 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.923088 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.923170 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.923184 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.923655 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.924360 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.924654 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.924948 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.925043 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.925144 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.925809 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.925813 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.926285 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.926606 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.926741 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.927113 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.928021 4711 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054" exitCode=255 Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.928077 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054"} Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.928611 4711 scope.go:117] "RemoveContainer" containerID="c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.928899 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.933740 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.936979 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.947905 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.958042 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.962463 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.962474 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha
256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.974580 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.985796 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988280 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988334 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988414 4711 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988427 4711 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988437 4711 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988449 4711 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988460 4711 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988472 4711 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988484 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: 
\"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988495 4711 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988504 4711 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988513 4711 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988525 4711 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988534 4711 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988544 4711 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988553 4711 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 
crc kubenswrapper[4711]: I1203 12:15:00.988563 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988577 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988585 4711 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988594 4711 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988604 4711 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988622 4711 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988649 4711 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988659 4711 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988669 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988678 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988687 4711 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988695 4711 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988704 4711 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988712 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988720 4711 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988728 4711 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988737 4711 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988745 4711 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988753 4711 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988761 4711 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988769 4711 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988777 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 03 
12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988785 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988794 4711 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988803 4711 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988811 4711 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988819 4711 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988826 4711 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988834 4711 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988843 4711 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988851 4711 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988860 4711 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988868 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988876 4711 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988883 4711 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988891 4711 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988899 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: 
\"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988925 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988940 4711 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988950 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988958 4711 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988965 4711 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988974 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988985 4711 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.988993 4711 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989000 4711 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989008 4711 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989016 4711 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989024 4711 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989032 4711 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989040 4711 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc 
kubenswrapper[4711]: I1203 12:15:00.989048 4711 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989055 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989066 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989075 4711 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989088 4711 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989103 4711 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989113 4711 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989123 4711 reconciler_common.go:293] "Volume 
detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989133 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989143 4711 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989154 4711 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989166 4711 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989176 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989187 4711 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989199 4711 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989210 4711 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989220 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989228 4711 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989235 4711 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989651 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989243 4711 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989711 4711 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989721 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989730 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989738 4711 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989757 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989765 4711 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989773 4711 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989781 4711 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989791 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989799 4711 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989807 4711 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989815 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989823 4711 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989831 4711 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989839 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on 
node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989847 4711 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989855 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989864 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989872 4711 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989880 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989888 4711 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989896 4711 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989922 4711 
reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989930 4711 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.989941 4711 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:15:00 crc kubenswrapper[4711]: I1203 12:15:00.990112 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.003374 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.012877 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.025014 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.036442 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.048171 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.058035 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.071143 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.075337 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.078318 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.084580 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 12:15:01 crc kubenswrapper[4711]: W1203 12:15:01.085367 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-c33bac1b356b887ed9383f82927e8ee62238a0641381cf273e1caba6cfb8d4d6 WatchSource:0}: Error finding container c33bac1b356b887ed9383f82927e8ee62238a0641381cf273e1caba6cfb8d4d6: Status 404 returned error can't find the container with id c33bac1b356b887ed9383f82927e8ee62238a0641381cf273e1caba6cfb8d4d6 Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.086502 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:01 crc kubenswrapper[4711]: W1203 12:15:01.088564 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-9e5e2d6c0848aee37d25f345e1a72dad6a26efbace6d9018bb39662054473bf0 WatchSource:0}: Error finding container 9e5e2d6c0848aee37d25f345e1a72dad6a26efbace6d9018bb39662054473bf0: Status 404 returned error can't find the container with id 
9e5e2d6c0848aee37d25f345e1a72dad6a26efbace6d9018bb39662054473bf0 Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.099162 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.150655 4711 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.393343 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:15:01 crc kubenswrapper[4711]: E1203 12:15:01.393584 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:15:02.39353801 +0000 UTC m=+21.062789295 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.494940 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.495006 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.495037 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.495075 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:01 crc kubenswrapper[4711]: E1203 12:15:01.495102 4711 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 12:15:01 crc kubenswrapper[4711]: E1203 12:15:01.495129 4711 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 12:15:01 crc kubenswrapper[4711]: E1203 12:15:01.495140 4711 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:15:01 crc kubenswrapper[4711]: E1203 12:15:01.495138 4711 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 12:15:01 crc kubenswrapper[4711]: E1203 12:15:01.495192 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 12:15:02.49517549 +0000 UTC m=+21.164426745 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:15:01 crc kubenswrapper[4711]: E1203 12:15:01.495218 4711 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 12:15:01 crc kubenswrapper[4711]: E1203 12:15:01.495243 4711 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 12:15:01 crc kubenswrapper[4711]: E1203 12:15:01.495259 4711 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:15:01 crc kubenswrapper[4711]: E1203 12:15:01.495259 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 12:15:02.495220911 +0000 UTC m=+21.164472216 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 12:15:01 crc kubenswrapper[4711]: E1203 12:15:01.495332 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 12:15:02.495313883 +0000 UTC m=+21.164565148 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:15:01 crc kubenswrapper[4711]: E1203 12:15:01.495359 4711 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 12:15:01 crc kubenswrapper[4711]: E1203 12:15:01.495414 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 12:15:02.495386075 +0000 UTC m=+21.164637450 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.820747 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.821350 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.823237 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.824057 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.825284 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.825949 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.826548 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.827955 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.828078 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.828763 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.830080 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.830615 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.831731 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.832448 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.834700 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.835463 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.836659 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.837441 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.837934 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.839137 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.839788 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.840347 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.841555 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.842083 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.843355 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" 
path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.843842 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.843963 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.845092 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.845750 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.846710 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.847277 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.848144 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.848586 4711 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 03 12:15:01 crc kubenswrapper[4711]: 
I1203 12:15:01.848686 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.850839 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.851790 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.852199 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.854067 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.854580 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.855100 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 
12:15:01.855590 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.856544 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.857221 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.858049 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.858620 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.859676 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.860273 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.861168 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 
12:15:01.861689 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.862559 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.863272 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.863759 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.864124 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.864555 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" 
path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.865485 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.866036 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.866637 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.867442 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.870990 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.880183 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.893346 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.931522 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.932843 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c"} Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.933063 4711 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.933992 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"24c3fe183c82f7ca1c57af3fbd3b4a23067c2fc56bb60438aaf6a49d8597b175"} Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.935339 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10"} Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.935373 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"9e5e2d6c0848aee37d25f345e1a72dad6a26efbace6d9018bb39662054473bf0"} Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.936685 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c33bac1b356b887ed9383f82927e8ee62238a0641381cf273e1caba6cfb8d4d6"} Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.950182 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.980066 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:01 crc kubenswrapper[4711]: I1203 12:15:01.991891 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.005035 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.016849 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.028862 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.050021 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.099097 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-g4t8g"] Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.099392 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-g4t8g" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.102793 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.102834 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.103241 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.117551 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.127212 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.139555 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.146509 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.156164 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.168220 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.180013 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.196316 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.200451 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfh4n\" (UniqueName: \"kubernetes.io/projected/0fe37859-a67f-4f4b-9c03-57db1ba5e5e9-kube-api-access-qfh4n\") pod \"node-resolver-g4t8g\" (UID: \"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\") " pod="openshift-dns/node-resolver-g4t8g" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.200513 4711 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0fe37859-a67f-4f4b-9c03-57db1ba5e5e9-hosts-file\") pod \"node-resolver-g4t8g\" (UID: \"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\") " pod="openshift-dns/node-resolver-g4t8g" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.301450 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfh4n\" (UniqueName: \"kubernetes.io/projected/0fe37859-a67f-4f4b-9c03-57db1ba5e5e9-kube-api-access-qfh4n\") pod \"node-resolver-g4t8g\" (UID: \"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\") " pod="openshift-dns/node-resolver-g4t8g" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.301508 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0fe37859-a67f-4f4b-9c03-57db1ba5e5e9-hosts-file\") pod \"node-resolver-g4t8g\" (UID: \"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\") " pod="openshift-dns/node-resolver-g4t8g" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.301593 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0fe37859-a67f-4f4b-9c03-57db1ba5e5e9-hosts-file\") pod \"node-resolver-g4t8g\" (UID: \"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\") " pod="openshift-dns/node-resolver-g4t8g" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.317189 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfh4n\" (UniqueName: \"kubernetes.io/projected/0fe37859-a67f-4f4b-9c03-57db1ba5e5e9-kube-api-access-qfh4n\") pod \"node-resolver-g4t8g\" (UID: \"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\") " pod="openshift-dns/node-resolver-g4t8g" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.402241 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:15:02 crc kubenswrapper[4711]: E1203 12:15:02.402383 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:15:04.402367753 +0000 UTC m=+23.071619008 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.410727 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-g4t8g" Dec 03 12:15:02 crc kubenswrapper[4711]: W1203 12:15:02.429562 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fe37859_a67f_4f4b_9c03_57db1ba5e5e9.slice/crio-1791733b80d35907dc017e3566626782c2b94650faf84d71dea98b9cecd4fd24 WatchSource:0}: Error finding container 1791733b80d35907dc017e3566626782c2b94650faf84d71dea98b9cecd4fd24: Status 404 returned error can't find the container with id 1791733b80d35907dc017e3566626782c2b94650faf84d71dea98b9cecd4fd24 Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.503462 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.503496 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.503518 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.503541 4711 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:02 crc kubenswrapper[4711]: E1203 12:15:02.503607 4711 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 12:15:02 crc kubenswrapper[4711]: E1203 12:15:02.503655 4711 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 12:15:02 crc kubenswrapper[4711]: E1203 12:15:02.503668 4711 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 12:15:02 crc kubenswrapper[4711]: E1203 12:15:02.503670 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 12:15:04.503655422 +0000 UTC m=+23.172906677 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 12:15:02 crc kubenswrapper[4711]: E1203 12:15:02.503678 4711 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:15:02 crc kubenswrapper[4711]: E1203 12:15:02.503704 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 12:15:04.503696463 +0000 UTC m=+23.172947718 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:15:02 crc kubenswrapper[4711]: E1203 12:15:02.503732 4711 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 12:15:02 crc kubenswrapper[4711]: E1203 12:15:02.503786 4711 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 12:15:02 crc kubenswrapper[4711]: E1203 12:15:02.503800 4711 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:15:02 crc kubenswrapper[4711]: E1203 12:15:02.503818 4711 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 12:15:02 crc kubenswrapper[4711]: E1203 12:15:02.503860 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 12:15:04.503839207 +0000 UTC m=+23.173090482 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:15:02 crc kubenswrapper[4711]: E1203 12:15:02.503969 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 12:15:04.50394044 +0000 UTC m=+23.173191695 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.546251 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-52jgg"] Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.546608 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.548379 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.548394 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.548572 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.548570 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.550219 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.559886 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:02Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.572665 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:02Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.602548 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:02Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.604969 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/776e7d35-d59b-4d4a-97cd-aec4f2441c1e-mcd-auth-proxy-config\") pod \"machine-config-daemon-52jgg\" (UID: 
\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\") " pod="openshift-machine-config-operator/machine-config-daemon-52jgg" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.604998 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rmcg\" (UniqueName: \"kubernetes.io/projected/776e7d35-d59b-4d4a-97cd-aec4f2441c1e-kube-api-access-7rmcg\") pod \"machine-config-daemon-52jgg\" (UID: \"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\") " pod="openshift-machine-config-operator/machine-config-daemon-52jgg" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.605051 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/776e7d35-d59b-4d4a-97cd-aec4f2441c1e-proxy-tls\") pod \"machine-config-daemon-52jgg\" (UID: \"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\") " pod="openshift-machine-config-operator/machine-config-daemon-52jgg" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.605069 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/776e7d35-d59b-4d4a-97cd-aec4f2441c1e-rootfs\") pod \"machine-config-daemon-52jgg\" (UID: \"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\") " pod="openshift-machine-config-operator/machine-config-daemon-52jgg" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.640021 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:02Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.669668 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:02Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.694267 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:02Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.706351 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/776e7d35-d59b-4d4a-97cd-aec4f2441c1e-proxy-tls\") pod \"machine-config-daemon-52jgg\" (UID: \"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\") " pod="openshift-machine-config-operator/machine-config-daemon-52jgg" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.706588 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/776e7d35-d59b-4d4a-97cd-aec4f2441c1e-rootfs\") pod \"machine-config-daemon-52jgg\" (UID: \"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\") " pod="openshift-machine-config-operator/machine-config-daemon-52jgg" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.706674 4711 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/776e7d35-d59b-4d4a-97cd-aec4f2441c1e-mcd-auth-proxy-config\") pod \"machine-config-daemon-52jgg\" (UID: \"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\") " pod="openshift-machine-config-operator/machine-config-daemon-52jgg" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.706727 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/776e7d35-d59b-4d4a-97cd-aec4f2441c1e-rootfs\") pod \"machine-config-daemon-52jgg\" (UID: \"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\") " pod="openshift-machine-config-operator/machine-config-daemon-52jgg" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.706767 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rmcg\" (UniqueName: \"kubernetes.io/projected/776e7d35-d59b-4d4a-97cd-aec4f2441c1e-kube-api-access-7rmcg\") pod \"machine-config-daemon-52jgg\" (UID: \"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\") " pod="openshift-machine-config-operator/machine-config-daemon-52jgg" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.707435 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/776e7d35-d59b-4d4a-97cd-aec4f2441c1e-mcd-auth-proxy-config\") pod \"machine-config-daemon-52jgg\" (UID: \"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\") " pod="openshift-machine-config-operator/machine-config-daemon-52jgg" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.711332 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/776e7d35-d59b-4d4a-97cd-aec4f2441c1e-proxy-tls\") pod \"machine-config-daemon-52jgg\" (UID: \"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\") " pod="openshift-machine-config-operator/machine-config-daemon-52jgg" Dec 03 12:15:02 crc 
kubenswrapper[4711]: I1203 12:15:02.714531 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:02Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.724100 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rmcg\" (UniqueName: \"kubernetes.io/projected/776e7d35-d59b-4d4a-97cd-aec4f2441c1e-kube-api-access-7rmcg\") pod \"machine-config-daemon-52jgg\" (UID: \"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\") " pod="openshift-machine-config-operator/machine-config-daemon-52jgg" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.731317 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:02Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.743387 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:02Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.816513 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.816564 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.816595 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:02 crc kubenswrapper[4711]: E1203 12:15:02.816644 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:02 crc kubenswrapper[4711]: E1203 12:15:02.816799 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:02 crc kubenswrapper[4711]: E1203 12:15:02.816873 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.859670 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" Dec 03 12:15:02 crc kubenswrapper[4711]: W1203 12:15:02.871821 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod776e7d35_d59b_4d4a_97cd_aec4f2441c1e.slice/crio-6a422b0ed1a4e01c142e27a6660a57f53d52d9dbb90f2a6d91095d9cf165aaf9 WatchSource:0}: Error finding container 6a422b0ed1a4e01c142e27a6660a57f53d52d9dbb90f2a6d91095d9cf165aaf9: Status 404 returned error can't find the container with id 6a422b0ed1a4e01c142e27a6660a57f53d52d9dbb90f2a6d91095d9cf165aaf9 Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.941570 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944"} Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.942884 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" event={"ID":"776e7d35-d59b-4d4a-97cd-aec4f2441c1e","Type":"ContainerStarted","Data":"6a422b0ed1a4e01c142e27a6660a57f53d52d9dbb90f2a6d91095d9cf165aaf9"} Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.944176 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109"} Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.945953 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-g4t8g" event={"ID":"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9","Type":"ContainerStarted","Data":"da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82"} Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 
12:15:02.946003 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-g4t8g" event={"ID":"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9","Type":"ContainerStarted","Data":"1791733b80d35907dc017e3566626782c2b94650faf84d71dea98b9cecd4fd24"} Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.963275 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:02Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.976979 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-gwhcr"] Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.977425 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gwhcr" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.977496 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-sfpts"] Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.983883 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:02Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.984261 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sfpts" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.984890 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ct6xt"] Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.986953 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.987024 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.987076 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.987149 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.987228 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.988599 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.991793 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.991803 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.991897 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.993637 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.993642 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.993705 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.993872 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.998856 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 03 12:15:02 crc kubenswrapper[4711]: I1203 12:15:02.999933 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.006602 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:02Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.055993 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.079877 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.094143 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.108943 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.109435 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-host-var-lib-kubelet\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.109478 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-kubelet\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.109503 4711 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-cnibin\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.109528 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.109583 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/33d2332f-fdac-42be-891e-7eaef0e7ca9d-ovnkube-script-lib\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.109622 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-run-netns\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.109640 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-etc-openvswitch\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc 
kubenswrapper[4711]: I1203 12:15:03.109662 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-run-systemd\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.109680 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-run-openvswitch\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.109697 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-run-ovn\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.109713 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-multus-socket-dir-parent\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.109734 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-host-run-k8s-cni-cncf-io\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc 
kubenswrapper[4711]: I1203 12:15:03.109752 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-host-run-netns\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.109772 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d9f176e2-84b4-4f7d-bf31-94ecb9f59e90-os-release\") pod \"multus-additional-cni-plugins-sfpts\" (UID: \"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\") " pod="openshift-multus/multus-additional-cni-plugins-sfpts" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.109810 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-var-lib-openvswitch\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.109829 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-slash\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.109866 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/216c3ac8-462c-49ec-87a2-c935d0c4ad25-multus-daemon-config\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 
12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.109881 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-node-log\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.109896 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-log-socket\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.109938 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-cni-netd\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.109956 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33d2332f-fdac-42be-891e-7eaef0e7ca9d-env-overrides\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.109972 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33d2332f-fdac-42be-891e-7eaef0e7ca9d-ovn-node-metrics-cert\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 
03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.110015 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh5vs\" (UniqueName: \"kubernetes.io/projected/33d2332f-fdac-42be-891e-7eaef0e7ca9d-kube-api-access-vh5vs\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.110033 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d9f176e2-84b4-4f7d-bf31-94ecb9f59e90-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sfpts\" (UID: \"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\") " pod="openshift-multus/multus-additional-cni-plugins-sfpts" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.110050 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-systemd-units\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.110091 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d9f176e2-84b4-4f7d-bf31-94ecb9f59e90-cnibin\") pod \"multus-additional-cni-plugins-sfpts\" (UID: \"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\") " pod="openshift-multus/multus-additional-cni-plugins-sfpts" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.110117 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-multus-cni-dir\") pod \"multus-gwhcr\" (UID: 
\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.110132 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-multus-conf-dir\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.110171 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33d2332f-fdac-42be-891e-7eaef0e7ca9d-ovnkube-config\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.110192 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px98s\" (UniqueName: \"kubernetes.io/projected/d9f176e2-84b4-4f7d-bf31-94ecb9f59e90-kube-api-access-px98s\") pod \"multus-additional-cni-plugins-sfpts\" (UID: \"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\") " pod="openshift-multus/multus-additional-cni-plugins-sfpts" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.110221 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztlqn\" (UniqueName: \"kubernetes.io/projected/216c3ac8-462c-49ec-87a2-c935d0c4ad25-kube-api-access-ztlqn\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.110238 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9f176e2-84b4-4f7d-bf31-94ecb9f59e90-system-cni-dir\") pod 
\"multus-additional-cni-plugins-sfpts\" (UID: \"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\") " pod="openshift-multus/multus-additional-cni-plugins-sfpts" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.110253 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-host-var-lib-cni-multus\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.110268 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d9f176e2-84b4-4f7d-bf31-94ecb9f59e90-cni-binary-copy\") pod \"multus-additional-cni-plugins-sfpts\" (UID: \"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\") " pod="openshift-multus/multus-additional-cni-plugins-sfpts" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.110283 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/216c3ac8-462c-49ec-87a2-c935d0c4ad25-cni-binary-copy\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.110300 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-host-run-multus-certs\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.110320 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-run-ovn-kubernetes\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.110346 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-cni-bin\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.110365 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-system-cni-dir\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.110405 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-hostroot\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.110438 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d9f176e2-84b4-4f7d-bf31-94ecb9f59e90-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sfpts\" (UID: \"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\") " pod="openshift-multus/multus-additional-cni-plugins-sfpts" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.110464 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-host-var-lib-cni-bin\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.110489 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-etc-kubernetes\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.110518 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-os-release\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.122655 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.135556 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.151225 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.167315 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.180015 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.190554 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.202163 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.211201 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d9f176e2-84b4-4f7d-bf31-94ecb9f59e90-cnibin\") pod \"multus-additional-cni-plugins-sfpts\" (UID: \"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\") " pod="openshift-multus/multus-additional-cni-plugins-sfpts" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.211251 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-multus-cni-dir\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.211273 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-multus-conf-dir\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.211299 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-systemd-units\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc 
kubenswrapper[4711]: I1203 12:15:03.211320 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33d2332f-fdac-42be-891e-7eaef0e7ca9d-ovnkube-config\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.211359 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px98s\" (UniqueName: \"kubernetes.io/projected/d9f176e2-84b4-4f7d-bf31-94ecb9f59e90-kube-api-access-px98s\") pod \"multus-additional-cni-plugins-sfpts\" (UID: \"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\") " pod="openshift-multus/multus-additional-cni-plugins-sfpts" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.211384 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztlqn\" (UniqueName: \"kubernetes.io/projected/216c3ac8-462c-49ec-87a2-c935d0c4ad25-kube-api-access-ztlqn\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.211406 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9f176e2-84b4-4f7d-bf31-94ecb9f59e90-system-cni-dir\") pod \"multus-additional-cni-plugins-sfpts\" (UID: \"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\") " pod="openshift-multus/multus-additional-cni-plugins-sfpts" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.211405 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-systemd-units\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 
12:15:03.211348 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d9f176e2-84b4-4f7d-bf31-94ecb9f59e90-cnibin\") pod \"multus-additional-cni-plugins-sfpts\" (UID: \"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\") " pod="openshift-multus/multus-additional-cni-plugins-sfpts" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.211477 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-multus-conf-dir\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.211430 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-host-var-lib-cni-multus\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.211549 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-multus-cni-dir\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.211564 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/216c3ac8-462c-49ec-87a2-c935d0c4ad25-cni-binary-copy\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.211569 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-host-var-lib-cni-multus\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.211511 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9f176e2-84b4-4f7d-bf31-94ecb9f59e90-system-cni-dir\") pod \"multus-additional-cni-plugins-sfpts\" (UID: \"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\") " pod="openshift-multus/multus-additional-cni-plugins-sfpts" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.211641 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-host-run-multus-certs\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.211699 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-run-ovn-kubernetes\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.211725 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-cni-bin\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.211731 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-host-run-multus-certs\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.211750 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d9f176e2-84b4-4f7d-bf31-94ecb9f59e90-cni-binary-copy\") pod \"multus-additional-cni-plugins-sfpts\" (UID: \"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\") " pod="openshift-multus/multus-additional-cni-plugins-sfpts" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.211776 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-system-cni-dir\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.211782 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-cni-bin\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.211750 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-run-ovn-kubernetes\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.211833 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-hostroot\") pod 
\"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.211939 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-host-var-lib-cni-bin\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.211953 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-hostroot\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.211966 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-etc-kubernetes\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.211997 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d9f176e2-84b4-4f7d-bf31-94ecb9f59e90-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sfpts\" (UID: \"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\") " pod="openshift-multus/multus-additional-cni-plugins-sfpts" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212003 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-host-var-lib-cni-bin\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc 
kubenswrapper[4711]: I1203 12:15:03.212044 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-os-release\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212071 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-system-cni-dir\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212043 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-etc-kubernetes\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212114 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-host-var-lib-kubelet\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212162 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33d2332f-fdac-42be-891e-7eaef0e7ca9d-ovnkube-config\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212187 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-kubelet\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212214 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-cnibin\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212236 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/33d2332f-fdac-42be-891e-7eaef0e7ca9d-ovnkube-script-lib\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212259 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-run-netns\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212283 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-etc-openvswitch\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212288 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-kubelet\") pod \"ovnkube-node-ct6xt\" (UID: 
\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212307 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212310 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-cnibin\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212334 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-run-netns\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212358 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-host-var-lib-kubelet\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212390 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-etc-openvswitch\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 
12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212429 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-run-openvswitch\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212508 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212573 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-run-ovn\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212588 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-run-openvswitch\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212640 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-run-ovn\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212664 4711 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-multus-socket-dir-parent\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212702 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-run-systemd\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212725 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-host-run-netns\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212735 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-multus-socket-dir-parent\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212748 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-host-run-k8s-cni-cncf-io\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212768 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-run-systemd\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212799 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-var-lib-openvswitch\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212822 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d9f176e2-84b4-4f7d-bf31-94ecb9f59e90-os-release\") pod \"multus-additional-cni-plugins-sfpts\" (UID: \"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\") " pod="openshift-multus/multus-additional-cni-plugins-sfpts" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212825 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-host-run-netns\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212799 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-host-run-k8s-cni-cncf-io\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212888 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d9f176e2-84b4-4f7d-bf31-94ecb9f59e90-os-release\") pod 
\"multus-additional-cni-plugins-sfpts\" (UID: \"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\") " pod="openshift-multus/multus-additional-cni-plugins-sfpts" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212870 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-slash\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212844 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-slash\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212952 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-var-lib-openvswitch\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212964 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/216c3ac8-462c-49ec-87a2-c935d0c4ad25-multus-daemon-config\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212512 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/216c3ac8-462c-49ec-87a2-c935d0c4ad25-os-release\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " 
pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.212992 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-log-socket\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.213019 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-cni-netd\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.213043 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33d2332f-fdac-42be-891e-7eaef0e7ca9d-env-overrides\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.213054 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-log-socket\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.213067 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33d2332f-fdac-42be-891e-7eaef0e7ca9d-ovn-node-metrics-cert\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 
12:15:03.213087 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-cni-netd\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.213105 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-node-log\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.213131 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh5vs\" (UniqueName: \"kubernetes.io/projected/33d2332f-fdac-42be-891e-7eaef0e7ca9d-kube-api-access-vh5vs\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.213156 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d9f176e2-84b4-4f7d-bf31-94ecb9f59e90-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sfpts\" (UID: \"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\") " pod="openshift-multus/multus-additional-cni-plugins-sfpts" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.213313 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-node-log\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.213449 4711 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d9f176e2-84b4-4f7d-bf31-94ecb9f59e90-cni-binary-copy\") pod \"multus-additional-cni-plugins-sfpts\" (UID: \"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\") " pod="openshift-multus/multus-additional-cni-plugins-sfpts" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.213593 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/216c3ac8-462c-49ec-87a2-c935d0c4ad25-cni-binary-copy\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.213598 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d9f176e2-84b4-4f7d-bf31-94ecb9f59e90-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sfpts\" (UID: \"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\") " pod="openshift-multus/multus-additional-cni-plugins-sfpts" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.213684 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d9f176e2-84b4-4f7d-bf31-94ecb9f59e90-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sfpts\" (UID: \"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\") " pod="openshift-multus/multus-additional-cni-plugins-sfpts" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.214037 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/33d2332f-fdac-42be-891e-7eaef0e7ca9d-ovnkube-script-lib\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.216868 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/33d2332f-fdac-42be-891e-7eaef0e7ca9d-env-overrides\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.217369 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/216c3ac8-462c-49ec-87a2-c935d0c4ad25-multus-daemon-config\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.229190 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33d2332f-fdac-42be-891e-7eaef0e7ca9d-ovn-node-metrics-cert\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.229664 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztlqn\" (UniqueName: \"kubernetes.io/projected/216c3ac8-462c-49ec-87a2-c935d0c4ad25-kube-api-access-ztlqn\") pod \"multus-gwhcr\" (UID: \"216c3ac8-462c-49ec-87a2-c935d0c4ad25\") " pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.230430 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh5vs\" (UniqueName: \"kubernetes.io/projected/33d2332f-fdac-42be-891e-7eaef0e7ca9d-kube-api-access-vh5vs\") pod \"ovnkube-node-ct6xt\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.237159 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px98s\" (UniqueName: \"kubernetes.io/projected/d9f176e2-84b4-4f7d-bf31-94ecb9f59e90-kube-api-access-px98s\") 
pod \"multus-additional-cni-plugins-sfpts\" (UID: \"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\") " pod="openshift-multus/multus-additional-cni-plugins-sfpts" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.237947 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.255362 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.268086 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.282798 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.299372 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.300408 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gwhcr" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.310857 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sfpts" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.313800 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287
ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: W1203 12:15:03.314809 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod216c3ac8_462c_49ec_87a2_c935d0c4ad25.slice/crio-35c27fe406ec409682172aadebe820e8270cf6dd15e71cbf27f441beb390b736 WatchSource:0}: Error finding container 35c27fe406ec409682172aadebe820e8270cf6dd15e71cbf27f441beb390b736: Status 404 returned error can't find the container with id 35c27fe406ec409682172aadebe820e8270cf6dd15e71cbf27f441beb390b736 Dec 03 12:15:03 crc kubenswrapper[4711]: W1203 12:15:03.322178 4711 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9f176e2_84b4_4f7d_bf31_94ecb9f59e90.slice/crio-104c1c72ba82e8df0606afd89fccca319f936b9c6d8f015f03098145fda05d32 WatchSource:0}: Error finding container 104c1c72ba82e8df0606afd89fccca319f936b9c6d8f015f03098145fda05d32: Status 404 returned error can't find the container with id 104c1c72ba82e8df0606afd89fccca319f936b9c6d8f015f03098145fda05d32 Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.333005 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.335055 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:03 crc kubenswrapper[4711]: W1203 12:15:03.355995 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33d2332f_fdac_42be_891e_7eaef0e7ca9d.slice/crio-8e1603daf0b3d5bfc8122a5f6be62ae00b8257d9451d66d7ea4e101f253b4020 WatchSource:0}: Error finding container 8e1603daf0b3d5bfc8122a5f6be62ae00b8257d9451d66d7ea4e101f253b4020: Status 404 returned error can't find the container with id 8e1603daf0b3d5bfc8122a5f6be62ae00b8257d9451d66d7ea4e101f253b4020 Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.387167 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.393514 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.401939 4711 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.408670 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.424325 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.438368 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.451389 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.463071 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.484724 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.504888 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.521795 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.537814 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.551371 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.568511 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.601933 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 
12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.621586 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.636039 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.654182 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.667370 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.680140 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.691801 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.702744 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.723210 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.736586 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 
12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.755244 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d
7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287f
aaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.785828 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.802096 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.821145 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.953944 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916"} Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.962274 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" event={"ID":"33d2332f-fdac-42be-891e-7eaef0e7ca9d","Type":"ContainerDied","Data":"b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2"} Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.962221 4711 generic.go:334] "Generic (PLEG): container finished" podID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerID="b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2" exitCode=0 Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.963180 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" event={"ID":"33d2332f-fdac-42be-891e-7eaef0e7ca9d","Type":"ContainerStarted","Data":"8e1603daf0b3d5bfc8122a5f6be62ae00b8257d9451d66d7ea4e101f253b4020"} Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.965101 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gwhcr" event={"ID":"216c3ac8-462c-49ec-87a2-c935d0c4ad25","Type":"ContainerStarted","Data":"4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45"} Dec 03 12:15:03 crc 
kubenswrapper[4711]: I1203 12:15:03.965158 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gwhcr" event={"ID":"216c3ac8-462c-49ec-87a2-c935d0c4ad25","Type":"ContainerStarted","Data":"35c27fe406ec409682172aadebe820e8270cf6dd15e71cbf27f441beb390b736"} Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.970508 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" event={"ID":"776e7d35-d59b-4d4a-97cd-aec4f2441c1e","Type":"ContainerStarted","Data":"b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a"} Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.970556 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" event={"ID":"776e7d35-d59b-4d4a-97cd-aec4f2441c1e","Type":"ContainerStarted","Data":"d82c5517a78be53885c5d2e8a414e572be5defba4fb0e90131d85a364814f137"} Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.973136 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" event={"ID":"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90","Type":"ContainerStarted","Data":"d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea"} Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.973175 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" event={"ID":"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90","Type":"ContainerStarted","Data":"104c1c72ba82e8df0606afd89fccca319f936b9c6d8f015f03098145fda05d32"} Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.975517 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:03 crc kubenswrapper[4711]: E1203 12:15:03.979451 4711 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:15:03 crc kubenswrapper[4711]: I1203 12:15:03.988315 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.000449 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.011975 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.025720 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.042523 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.057968 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.078077 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.092558 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.111533 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.125399 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.143808 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.194038 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.235551 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.263294 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.303380 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba
4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.342238 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.381683 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T12:15:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.421086 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.423368 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:15:04 crc kubenswrapper[4711]: E1203 12:15:04.423533 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:15:08.423510509 +0000 UTC m=+27.092761764 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.469357 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.503142 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.524277 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 
12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.524327 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.524352 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.524383 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:04 crc kubenswrapper[4711]: E1203 12:15:04.524497 4711 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 12:15:04 crc kubenswrapper[4711]: E1203 12:15:04.524548 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 12:15:08.524530612 +0000 UTC m=+27.193781867 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 12:15:04 crc kubenswrapper[4711]: E1203 12:15:04.524554 4711 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 12:15:04 crc kubenswrapper[4711]: E1203 12:15:04.524591 4711 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 12:15:04 crc kubenswrapper[4711]: E1203 12:15:04.524590 4711 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 12:15:04 crc kubenswrapper[4711]: E1203 12:15:04.524607 4711 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:15:04 crc kubenswrapper[4711]: E1203 12:15:04.524622 4711 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 12:15:04 crc kubenswrapper[4711]: E1203 12:15:04.524634 4711 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:15:04 crc kubenswrapper[4711]: E1203 12:15:04.524672 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 12:15:08.524653605 +0000 UTC m=+27.193904870 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:15:04 crc kubenswrapper[4711]: E1203 12:15:04.524694 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 12:15:08.524684396 +0000 UTC m=+27.193935771 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:15:04 crc kubenswrapper[4711]: E1203 12:15:04.524820 4711 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 12:15:04 crc kubenswrapper[4711]: E1203 12:15:04.524870 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 12:15:08.52485856 +0000 UTC m=+27.194109915 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.541982 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.584692 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.609783 4711 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:15:04 crc 
kubenswrapper[4711]: I1203 12:15:04.611622 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.611695 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.611711 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.611857 4711 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.628705 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d
742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.676195 4711 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.676470 4711 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.677406 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.677436 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.677444 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 
12:15:04.677457 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.677466 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:04Z","lastTransitionTime":"2025-12-03T12:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:04 crc kubenswrapper[4711]: E1203 12:15:04.696431 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.700669 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.700710 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.700720 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.700736 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.700747 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:04Z","lastTransitionTime":"2025-12-03T12:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.708822 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:04 crc kubenswrapper[4711]: E1203 12:15:04.712128 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.714970 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.715014 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.715027 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.715043 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.715054 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:04Z","lastTransitionTime":"2025-12-03T12:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:04 crc kubenswrapper[4711]: E1203 12:15:04.731945 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.735473 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.735515 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.735526 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.735543 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.735554 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:04Z","lastTransitionTime":"2025-12-03T12:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.745350 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:04 crc kubenswrapper[4711]: E1203 12:15:04.748223 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.751644 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.751686 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.751697 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.751714 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.751726 4711 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:04Z","lastTransitionTime":"2025-12-03T12:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:04 crc kubenswrapper[4711]: E1203 12:15:04.764253 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:04 crc kubenswrapper[4711]: E1203 12:15:04.764365 4711 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.766094 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.766126 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.766137 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.766151 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.766161 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:04Z","lastTransitionTime":"2025-12-03T12:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.816183 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:04 crc kubenswrapper[4711]: E1203 12:15:04.816518 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.816196 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.816192 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:04 crc kubenswrapper[4711]: E1203 12:15:04.816605 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:04 crc kubenswrapper[4711]: E1203 12:15:04.816665 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.869085 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.869139 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.869150 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.869168 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.869179 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:04Z","lastTransitionTime":"2025-12-03T12:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.971711 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.971762 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.971773 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.971790 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.971800 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:04Z","lastTransitionTime":"2025-12-03T12:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.976674 4711 generic.go:334] "Generic (PLEG): container finished" podID="d9f176e2-84b4-4f7d-bf31-94ecb9f59e90" containerID="d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea" exitCode=0 Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.976765 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" event={"ID":"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90","Type":"ContainerDied","Data":"d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea"} Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.981554 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" event={"ID":"33d2332f-fdac-42be-891e-7eaef0e7ca9d","Type":"ContainerStarted","Data":"8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a"} Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.981614 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" event={"ID":"33d2332f-fdac-42be-891e-7eaef0e7ca9d","Type":"ContainerStarted","Data":"ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64"} Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.981630 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" event={"ID":"33d2332f-fdac-42be-891e-7eaef0e7ca9d","Type":"ContainerStarted","Data":"32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52"} Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.981643 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" event={"ID":"33d2332f-fdac-42be-891e-7eaef0e7ca9d","Type":"ContainerStarted","Data":"f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa"} Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.981653 4711 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" event={"ID":"33d2332f-fdac-42be-891e-7eaef0e7ca9d","Type":"ContainerStarted","Data":"6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0"} Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.981665 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" event={"ID":"33d2332f-fdac-42be-891e-7eaef0e7ca9d","Type":"ContainerStarted","Data":"53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e"} Dec 03 12:15:04 crc kubenswrapper[4711]: I1203 12:15:04.991643 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b49
1cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.004288 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.017709 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.028315 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba
4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.040648 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T12:15:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.049587 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.066045 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.074296 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.074340 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.074353 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.074369 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.074380 4711 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:05Z","lastTransitionTime":"2025-12-03T12:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.079250 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:
15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.105883 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.142433 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.176082 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 
12:15:05.176134 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.176146 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.176163 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.176176 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:05Z","lastTransitionTime":"2025-12-03T12:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.182014 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.223889 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-n9ptm"] Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.224318 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-n9ptm" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.226227 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.234534 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.255790 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.275395 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.277745 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.277794 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.277806 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.277823 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 
12:15:05.277834 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:05Z","lastTransitionTime":"2025-12-03T12:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.294613 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.330343 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c4b974b5-a578-4c3c-b6a0-0038d19cb565-serviceca\") pod \"node-ca-n9ptm\" (UID: \"c4b974b5-a578-4c3c-b6a0-0038d19cb565\") " pod="openshift-image-registry/node-ca-n9ptm" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.330406 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c4b974b5-a578-4c3c-b6a0-0038d19cb565-host\") pod \"node-ca-n9ptm\" (UID: \"c4b974b5-a578-4c3c-b6a0-0038d19cb565\") " pod="openshift-image-registry/node-ca-n9ptm" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.330443 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpcvh\" (UniqueName: \"kubernetes.io/projected/c4b974b5-a578-4c3c-b6a0-0038d19cb565-kube-api-access-fpcvh\") pod \"node-ca-n9ptm\" (UID: \"c4b974b5-a578-4c3c-b6a0-0038d19cb565\") " pod="openshift-image-registry/node-ca-n9ptm" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.344580 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.381110 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.381146 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.381155 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.381180 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.381190 4711 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:05Z","lastTransitionTime":"2025-12-03T12:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.384409 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.423311 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.432018 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c4b974b5-a578-4c3c-b6a0-0038d19cb565-serviceca\") pod \"node-ca-n9ptm\" (UID: \"c4b974b5-a578-4c3c-b6a0-0038d19cb565\") " pod="openshift-image-registry/node-ca-n9ptm" Dec 03 
12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.432116 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c4b974b5-a578-4c3c-b6a0-0038d19cb565-host\") pod \"node-ca-n9ptm\" (UID: \"c4b974b5-a578-4c3c-b6a0-0038d19cb565\") " pod="openshift-image-registry/node-ca-n9ptm" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.432168 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpcvh\" (UniqueName: \"kubernetes.io/projected/c4b974b5-a578-4c3c-b6a0-0038d19cb565-kube-api-access-fpcvh\") pod \"node-ca-n9ptm\" (UID: \"c4b974b5-a578-4c3c-b6a0-0038d19cb565\") " pod="openshift-image-registry/node-ca-n9ptm" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.432272 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c4b974b5-a578-4c3c-b6a0-0038d19cb565-host\") pod \"node-ca-n9ptm\" (UID: \"c4b974b5-a578-4c3c-b6a0-0038d19cb565\") " pod="openshift-image-registry/node-ca-n9ptm" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.432991 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c4b974b5-a578-4c3c-b6a0-0038d19cb565-serviceca\") pod \"node-ca-n9ptm\" (UID: \"c4b974b5-a578-4c3c-b6a0-0038d19cb565\") " pod="openshift-image-registry/node-ca-n9ptm" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.475120 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpcvh\" (UniqueName: \"kubernetes.io/projected/c4b974b5-a578-4c3c-b6a0-0038d19cb565-kube-api-access-fpcvh\") pod \"node-ca-n9ptm\" (UID: \"c4b974b5-a578-4c3c-b6a0-0038d19cb565\") " pod="openshift-image-registry/node-ca-n9ptm" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.482582 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\
\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.483087 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 
12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.483133 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.483146 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.483162 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.483173 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:05Z","lastTransitionTime":"2025-12-03T12:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.530642 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.536877 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-n9ptm" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.560928 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:05 crc kubenswrapper[4711]: W1203 12:15:05.583392 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4b974b5_a578_4c3c_b6a0_0038d19cb565.slice/crio-5b8b2c5122990fe95dc9b0ec18cbc60c9e2c4cdc3c8dddd1d2a397cb01b58b2b WatchSource:0}: Error finding container 5b8b2c5122990fe95dc9b0ec18cbc60c9e2c4cdc3c8dddd1d2a397cb01b58b2b: Status 404 returned error can't find the container with id 
5b8b2c5122990fe95dc9b0ec18cbc60c9e2c4cdc3c8dddd1d2a397cb01b58b2b Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.585412 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.585445 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.585457 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.585472 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.585480 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:05Z","lastTransitionTime":"2025-12-03T12:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.603484 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.643178 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:05Z is after 2025-08-24T17:21:41Z" Dec 03 
12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.681451 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.689042 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.689103 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.689111 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.689149 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.689159 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:05Z","lastTransitionTime":"2025-12-03T12:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.722639 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.763593 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.792362 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.792393 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.792401 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.792414 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.792423 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:05Z","lastTransitionTime":"2025-12-03T12:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.804061 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.843376 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703
f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.890228 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.894926 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.894963 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.894974 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.894990 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.894999 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:05Z","lastTransitionTime":"2025-12-03T12:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.924160 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.986654 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-n9ptm" event={"ID":"c4b974b5-a578-4c3c-b6a0-0038d19cb565","Type":"ContainerStarted","Data":"251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42"} Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.986703 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-n9ptm" event={"ID":"c4b974b5-a578-4c3c-b6a0-0038d19cb565","Type":"ContainerStarted","Data":"5b8b2c5122990fe95dc9b0ec18cbc60c9e2c4cdc3c8dddd1d2a397cb01b58b2b"} Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.988225 4711 generic.go:334] "Generic (PLEG): container finished" podID="d9f176e2-84b4-4f7d-bf31-94ecb9f59e90" containerID="595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3" exitCode=0 Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.988253 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" event={"ID":"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90","Type":"ContainerDied","Data":"595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3"} Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.997479 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.997510 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.997518 
4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.997531 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:05 crc kubenswrapper[4711]: I1203 12:15:05.997540 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:05Z","lastTransitionTime":"2025-12-03T12:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.000169 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f
55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.011570 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.049534 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.085815 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.100460 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.100498 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.100509 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.100525 4711 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.100537 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:06Z","lastTransitionTime":"2025-12-03T12:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.143517 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.169425 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.202579 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.202621 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:06 crc 
kubenswrapper[4711]: I1203 12:15:06.202630 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.202647 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.202656 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:06Z","lastTransitionTime":"2025-12-03T12:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.203493 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.243524 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.283650 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.305853 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.305902 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.305946 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.305970 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.305986 4711 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:06Z","lastTransitionTime":"2025-12-03T12:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.321438 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.362472 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.404829 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.408201 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.408230 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.408239 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.408254 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.408264 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:06Z","lastTransitionTime":"2025-12-03T12:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.442131 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.484827 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba
4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.510258 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.510302 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.510314 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:06 crc 
kubenswrapper[4711]: I1203 12:15:06.510332 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.510345 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:06Z","lastTransitionTime":"2025-12-03T12:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.528230 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.564649 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.600757 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.612141 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.612174 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.612182 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.612195 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.612203 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:06Z","lastTransitionTime":"2025-12-03T12:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.642352 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.683707 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.714216 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:06 crc 
kubenswrapper[4711]: I1203 12:15:06.714256 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.714278 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.714306 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.714316 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:06Z","lastTransitionTime":"2025-12-03T12:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.729728 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.762065 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.803848 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.815792 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.815828 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.815840 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.815856 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.815866 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:06Z","lastTransitionTime":"2025-12-03T12:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.816400 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.816437 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.816441 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:06 crc kubenswrapper[4711]: E1203 12:15:06.816524 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:06 crc kubenswrapper[4711]: E1203 12:15:06.816698 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:06 crc kubenswrapper[4711]: E1203 12:15:06.816774 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.842531 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":
\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.882621 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.917595 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.917636 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.917660 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.917681 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.917698 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:06Z","lastTransitionTime":"2025-12-03T12:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.923502 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.962346 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.994196 4711 generic.go:334] "Generic (PLEG): container finished" podID="d9f176e2-84b4-4f7d-bf31-94ecb9f59e90" containerID="551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566" exitCode=0 Dec 03 12:15:06 crc kubenswrapper[4711]: I1203 12:15:06.994254 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" event={"ID":"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90","Type":"ContainerDied","Data":"551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566"} Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.004555 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.019295 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.019329 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.019339 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.019353 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.019363 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:07Z","lastTransitionTime":"2025-12-03T12:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.043036 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.082179 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T12:15:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.121215 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.121374 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.121413 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.121424 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.121443 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.121455 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:07Z","lastTransitionTime":"2025-12-03T12:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.170699 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.205190 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.223239 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.223280 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.223289 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.223304 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.223315 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:07Z","lastTransitionTime":"2025-12-03T12:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.243686 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 
12:15:07.284958 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.326100 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.326148 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.326159 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.326178 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.326188 4711 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:07Z","lastTransitionTime":"2025-12-03T12:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.327490 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.362944 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.403466 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:07Z is after 2025-08-24T17:21:41Z" Dec 03 
12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.428873 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.428927 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.428938 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.428952 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.428960 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:07Z","lastTransitionTime":"2025-12-03T12:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.443460 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.482402 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.524668 4711 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.531452 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.531501 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.531513 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.531532 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.531549 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:07Z","lastTransitionTime":"2025-12-03T12:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.563827 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.606358 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba
4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.633889 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.633966 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.633978 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:07 crc 
kubenswrapper[4711]: I1203 12:15:07.633997 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.634009 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:07Z","lastTransitionTime":"2025-12-03T12:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.736993 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.737026 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.737035 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.737048 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.737057 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:07Z","lastTransitionTime":"2025-12-03T12:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.839669 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.839894 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.839936 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.839952 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.839964 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:07Z","lastTransitionTime":"2025-12-03T12:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.942429 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.942468 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.942478 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.942496 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.942507 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:07Z","lastTransitionTime":"2025-12-03T12:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.999507 4711 generic.go:334] "Generic (PLEG): container finished" podID="d9f176e2-84b4-4f7d-bf31-94ecb9f59e90" containerID="5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c" exitCode=0 Dec 03 12:15:07 crc kubenswrapper[4711]: I1203 12:15:07.999574 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" event={"ID":"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90","Type":"ContainerDied","Data":"5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c"} Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.002854 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" event={"ID":"33d2332f-fdac-42be-891e-7eaef0e7ca9d","Type":"ContainerStarted","Data":"923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b"} Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.014537 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T12:15:08Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.025534 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:08Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.045561 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.045601 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.045614 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.045631 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.045643 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:08Z","lastTransitionTime":"2025-12-03T12:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.047251 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:08Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.062737 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce
990fd5a63a90098ae4dd7f42566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
2-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:08Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.075947 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:08Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.087277 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:08Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.098511 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:08Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.113906 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:08Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.154127 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.154159 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.154167 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.154180 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.154189 4711 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:08Z","lastTransitionTime":"2025-12-03T12:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.156465 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:08Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.186614 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:08Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.199968 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:08Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.215715 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:08Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.228956 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:08Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.239105 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba
4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:08Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.256228 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.256270 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.256283 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:08 crc 
kubenswrapper[4711]: I1203 12:15:08.256299 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.256312 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:08Z","lastTransitionTime":"2025-12-03T12:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.358481 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.358526 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.358543 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.358566 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.358577 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:08Z","lastTransitionTime":"2025-12-03T12:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.459558 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:15:08 crc kubenswrapper[4711]: E1203 12:15:08.459775 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:15:16.459739272 +0000 UTC m=+35.128990587 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.460684 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.460733 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.460744 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.460759 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:08 crc kubenswrapper[4711]: 
I1203 12:15:08.460771 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:08Z","lastTransitionTime":"2025-12-03T12:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.560986 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.561033 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.561059 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.561093 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:08 crc kubenswrapper[4711]: E1203 12:15:08.561184 4711 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 12:15:08 crc kubenswrapper[4711]: E1203 12:15:08.561186 4711 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 12:15:08 crc kubenswrapper[4711]: E1203 12:15:08.561185 4711 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 12:15:08 crc kubenswrapper[4711]: E1203 12:15:08.561239 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 12:15:16.561224157 +0000 UTC m=+35.230475412 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 12:15:08 crc kubenswrapper[4711]: E1203 12:15:08.561304 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-03 12:15:16.561279548 +0000 UTC m=+35.230530843 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 12:15:08 crc kubenswrapper[4711]: E1203 12:15:08.561200 4711 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 12:15:08 crc kubenswrapper[4711]: E1203 12:15:08.561339 4711 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:15:08 crc kubenswrapper[4711]: E1203 12:15:08.561387 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 12:15:16.561375091 +0000 UTC m=+35.230626356 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:15:08 crc kubenswrapper[4711]: E1203 12:15:08.561217 4711 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 12:15:08 crc kubenswrapper[4711]: E1203 12:15:08.561466 4711 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 12:15:08 crc kubenswrapper[4711]: E1203 12:15:08.561480 4711 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:15:08 crc kubenswrapper[4711]: E1203 12:15:08.561520 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 12:15:16.561507654 +0000 UTC m=+35.230758969 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.562684 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.562709 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.562719 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.562733 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.562743 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:08Z","lastTransitionTime":"2025-12-03T12:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.664667 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.664717 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.664728 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.664745 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.664757 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:08Z","lastTransitionTime":"2025-12-03T12:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.766390 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.766428 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.766437 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.766451 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.766461 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:08Z","lastTransitionTime":"2025-12-03T12:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.816350 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.816391 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.816476 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:08 crc kubenswrapper[4711]: E1203 12:15:08.816638 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:08 crc kubenswrapper[4711]: E1203 12:15:08.816763 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:08 crc kubenswrapper[4711]: E1203 12:15:08.816883 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.868472 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.868521 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.868537 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.868558 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.868573 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:08Z","lastTransitionTime":"2025-12-03T12:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.974235 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.974268 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.974276 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.974289 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:08 crc kubenswrapper[4711]: I1203 12:15:08.974300 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:08Z","lastTransitionTime":"2025-12-03T12:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.008735 4711 generic.go:334] "Generic (PLEG): container finished" podID="d9f176e2-84b4-4f7d-bf31-94ecb9f59e90" containerID="c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007" exitCode=0 Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.008792 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" event={"ID":"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90","Type":"ContainerDied","Data":"c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007"} Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.026948 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:09Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.040816 4711 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:09Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.053864 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:09Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.071114 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:09Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.076178 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.076219 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.076231 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.076247 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.076261 4711 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:09Z","lastTransitionTime":"2025-12-03T12:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.083187 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:09Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.093991 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba
4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:09Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.107423 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:09Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.119601 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T12:15:09Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.129211 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:09Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.145798 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:09Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.159632 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:09Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.171338 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:09Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.178172 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.178206 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.178217 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.178233 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.178245 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:09Z","lastTransitionTime":"2025-12-03T12:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.183640 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:09Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.200574 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:09Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.280682 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.280727 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.280740 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.280759 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.280772 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:09Z","lastTransitionTime":"2025-12-03T12:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.383012 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.383349 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.383360 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.383377 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.383389 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:09Z","lastTransitionTime":"2025-12-03T12:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.485847 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.485943 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.485970 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.485995 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.486013 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:09Z","lastTransitionTime":"2025-12-03T12:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.588390 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.588426 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.588437 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.588455 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.588466 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:09Z","lastTransitionTime":"2025-12-03T12:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.690617 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.690647 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.690655 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.690667 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.690676 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:09Z","lastTransitionTime":"2025-12-03T12:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.793258 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.793329 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.793351 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.793381 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.793403 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:09Z","lastTransitionTime":"2025-12-03T12:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.895657 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.895686 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.895696 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.895711 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.895722 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:09Z","lastTransitionTime":"2025-12-03T12:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.999071 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.999130 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.999144 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.999163 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:09 crc kubenswrapper[4711]: I1203 12:15:09.999177 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:09Z","lastTransitionTime":"2025-12-03T12:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.101209 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.101255 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.101264 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.101549 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.101578 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:10Z","lastTransitionTime":"2025-12-03T12:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.204748 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.204801 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.204818 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.204840 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.204857 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:10Z","lastTransitionTime":"2025-12-03T12:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.307718 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.307778 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.307794 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.307818 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.307835 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:10Z","lastTransitionTime":"2025-12-03T12:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.411017 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.411306 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.411532 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.411811 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.412170 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:10Z","lastTransitionTime":"2025-12-03T12:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.515342 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.515397 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.515405 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.515419 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.515428 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:10Z","lastTransitionTime":"2025-12-03T12:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.617974 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.618036 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.618056 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.618089 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.618115 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:10Z","lastTransitionTime":"2025-12-03T12:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.721064 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.721156 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.721171 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.721191 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.721206 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:10Z","lastTransitionTime":"2025-12-03T12:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.816775 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.816847 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:10 crc kubenswrapper[4711]: E1203 12:15:10.816950 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.816969 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:10 crc kubenswrapper[4711]: E1203 12:15:10.817105 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:10 crc kubenswrapper[4711]: E1203 12:15:10.817219 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.823617 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.823650 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.823659 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.823673 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.823684 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:10Z","lastTransitionTime":"2025-12-03T12:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.926645 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.926686 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.926698 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.926716 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:10 crc kubenswrapper[4711]: I1203 12:15:10.926728 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:10Z","lastTransitionTime":"2025-12-03T12:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.020172 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" event={"ID":"33d2332f-fdac-42be-891e-7eaef0e7ca9d","Type":"ContainerStarted","Data":"c76600ba3586e77ab51715123f39b4904e61ad69c3c5976e0bc50e7008a47d8c"} Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.020553 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.024083 4711 generic.go:334] "Generic (PLEG): container finished" podID="d9f176e2-84b4-4f7d-bf31-94ecb9f59e90" containerID="a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a" exitCode=0 Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.024120 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" event={"ID":"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90","Type":"ContainerDied","Data":"a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a"} Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.028972 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.029007 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.029020 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.029035 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.029048 4711 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:11Z","lastTransitionTime":"2025-12-03T12:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.033579 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.045503 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.045897 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.058484 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\
\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.071263 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.084203 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.098853 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 
12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.112455 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.125132 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.133830 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.133873 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.133885 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.133901 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.133931 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:11Z","lastTransitionTime":"2025-12-03T12:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.139740 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.153813 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.175515 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba
4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.190969 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.202659 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.221370 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready 
status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-n
ode-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"
volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76600ba3586e77ab51715123f39b4904e61ad69c3c5976e0bc50e7008a47d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.233528 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the 
pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.236633 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.236686 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.236702 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.236728 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.236755 4711 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:11Z","lastTransitionTime":"2025-12-03T12:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.246310 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.261522 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.277572 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.293356 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.307811 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 
12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.319003 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.330173 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.338777 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.338810 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.338819 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.338833 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.338843 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:11Z","lastTransitionTime":"2025-12-03T12:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.342127 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.353038 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.363328 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba
4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.372508 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.380820 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.395319 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready 
status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-m
etrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-
controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76600ba3586e77ab51715123f39b4904e61ad69c3c5976e0bc50e7008a47d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732
57453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.441279 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.441316 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.441325 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.441339 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.441349 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:11Z","lastTransitionTime":"2025-12-03T12:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.543845 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.543891 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.543921 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.543939 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.543951 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:11Z","lastTransitionTime":"2025-12-03T12:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.646420 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.646465 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.646477 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.646502 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.646515 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:11Z","lastTransitionTime":"2025-12-03T12:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.749123 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.749376 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.749385 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.749400 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.749409 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:11Z","lastTransitionTime":"2025-12-03T12:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.832006 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e483
1acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.852258 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.852299 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.852310 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.852327 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.852339 4711 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:11Z","lastTransitionTime":"2025-12-03T12:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.854260 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.866499 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.876713 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba
4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.887954 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.901601 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.916745 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready 
status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-m
etrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-
controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76600ba3586e77ab51715123f39b4904e61ad69c3c5976e0bc50e7008a47d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732
57453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.933780 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.945192 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.954037 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 
12:15:11.954086 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.954096 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.954111 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.954120 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:11Z","lastTransitionTime":"2025-12-03T12:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.962416 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.975789 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4711]: I1203 12:15:11.988965 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.006309 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.016179 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.033212 4711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.033297 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" event={"ID":"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90","Type":"ContainerStarted","Data":"80ca10d47adbb1e21793d3c069a06e2bcaf9134f0a194fd74c97632317dbbaed"} Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.033966 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.048958 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.054231 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.055946 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.055980 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.055991 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.056008 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.056019 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:12Z","lastTransitionTime":"2025-12-03T12:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.063017 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.080677 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:12Z is after 2025-08-24T17:21:41Z" Dec 03 
12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.093361 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.110440 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba
4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.135474 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.152717 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.158462 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.158707 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.158791 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.158867 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.158962 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:12Z","lastTransitionTime":"2025-12-03T12:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.162985 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.182844 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76600ba3586e77ab51715123f39b4904e61ad69c3c5976e0bc50e7008a47d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.193717 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.203330 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.214423 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.225159 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.240331 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ca10d47adbb1e21793d3c069a06e2bcaf9134f0a194fd74c97632317dbbaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964
d75412c30e206ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:
15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:08Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.253400 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.261249 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.261280 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.261287 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.261306 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.261319 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:12Z","lastTransitionTime":"2025-12-03T12:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.268828 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.279256 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.292473 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.305258 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.316279 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.326002 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba
4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.337666 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T12:15:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.347541 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.363609 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.363642 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.363652 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.363667 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.363681 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:12Z","lastTransitionTime":"2025-12-03T12:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.365177 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76600ba3586e77ab51715123f39b4904e61ad69c3c5976e0bc50e7008a47d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.382105 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.393475 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.405232 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.419493 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ca10d47adbb1e21793d3c069a06e2bcaf9134f0a194fd74c97632317dbbaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964
d75412c30e206ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:
15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:08Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.466369 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.466405 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.466416 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.466434 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.466446 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:12Z","lastTransitionTime":"2025-12-03T12:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.568684 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.568721 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.568730 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.568744 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.568754 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:12Z","lastTransitionTime":"2025-12-03T12:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.670998 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.671035 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.671043 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.671059 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.671067 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:12Z","lastTransitionTime":"2025-12-03T12:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.774015 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.774080 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.774097 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.774120 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.774138 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:12Z","lastTransitionTime":"2025-12-03T12:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.816303 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.816418 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.816563 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:12 crc kubenswrapper[4711]: E1203 12:15:12.816552 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:12 crc kubenswrapper[4711]: E1203 12:15:12.816665 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:12 crc kubenswrapper[4711]: E1203 12:15:12.816750 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.876410 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.876448 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.876456 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.876470 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.876479 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:12Z","lastTransitionTime":"2025-12-03T12:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.978555 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.978610 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.978625 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.978641 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:12 crc kubenswrapper[4711]: I1203 12:15:12.978654 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:12Z","lastTransitionTime":"2025-12-03T12:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.037983 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ct6xt_33d2332f-fdac-42be-891e-7eaef0e7ca9d/ovnkube-controller/0.log" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.041148 4711 generic.go:334] "Generic (PLEG): container finished" podID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerID="c76600ba3586e77ab51715123f39b4904e61ad69c3c5976e0bc50e7008a47d8c" exitCode=1 Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.041197 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" event={"ID":"33d2332f-fdac-42be-891e-7eaef0e7ca9d","Type":"ContainerDied","Data":"c76600ba3586e77ab51715123f39b4904e61ad69c3c5976e0bc50e7008a47d8c"} Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.042065 4711 scope.go:117] "RemoveContainer" containerID="c76600ba3586e77ab51715123f39b4904e61ad69c3c5976e0bc50e7008a47d8c" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.057210 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:13Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.071714 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:13Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.080712 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.080768 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.080784 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.080806 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.080823 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:13Z","lastTransitionTime":"2025-12-03T12:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.085355 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:13Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.099395 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ca10d47adbb1e21793d3c069a06e2bcaf9134f0a194fd74c97632317dbbaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:13Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.109648 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a
45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:13Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.123734 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:13Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.146585 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:13Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.156688 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:13Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.167502 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba
4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:13Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.178642 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:13Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.182298 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.182333 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.182341 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.182353 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.182361 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:13Z","lastTransitionTime":"2025-12-03T12:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.191632 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:13Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.202814 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:13Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.218886 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76600ba3586e77ab51715123f39b4904e61ad69c3c5976e0bc50e7008a47d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c76600ba3586e77ab51715123f39b4904e61ad69c3c5976e0bc50e7008a47d8c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"message\\\":\\\"rvice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:12.622262 6000 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:12.622525 6000 
reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:12.622583 6000 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:12.622687 6000 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:12.622715 6000 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:15:12.623100 6000 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 12:15:12.623115 6000 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 12:15:12.623152 6000 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 12:15:12.623170 6000 factory.go:656] Stopping watch factory\\\\nI1203 12:15:12.623180 6000 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 12:15:12.623187 6000 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 12:15:12.623189 6000 ovnkube.go:599] Stopped 
ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b3
03c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:13Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.231464 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T12:15:13Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.285877 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.285940 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.285951 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.285969 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.285981 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:13Z","lastTransitionTime":"2025-12-03T12:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.388129 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.388166 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.388176 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.388191 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.388204 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:13Z","lastTransitionTime":"2025-12-03T12:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.490728 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.490766 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.490774 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.490789 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.490798 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:13Z","lastTransitionTime":"2025-12-03T12:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.593235 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.593270 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.593279 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.593291 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.593300 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:13Z","lastTransitionTime":"2025-12-03T12:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.696657 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.696708 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.696718 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.696739 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.696754 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:13Z","lastTransitionTime":"2025-12-03T12:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.800151 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.800200 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.800212 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.800235 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.800249 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:13Z","lastTransitionTime":"2025-12-03T12:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.903365 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.903428 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.903439 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.903479 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:13 crc kubenswrapper[4711]: I1203 12:15:13.903494 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:13Z","lastTransitionTime":"2025-12-03T12:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.006058 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.006389 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.006398 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.006411 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.006420 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:14Z","lastTransitionTime":"2025-12-03T12:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.051832 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ct6xt_33d2332f-fdac-42be-891e-7eaef0e7ca9d/ovnkube-controller/0.log" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.054789 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" event={"ID":"33d2332f-fdac-42be-891e-7eaef0e7ca9d","Type":"ContainerStarted","Data":"2d6c75c0758bd56511acab8ad071513808bad9827e81f82e91cfe549fffacbe4"} Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.054878 4711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.072459 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.086676 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:14Z is after 2025-08-24T17:21:41Z" Dec 03 
12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.097176 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.108745 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.108780 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.108790 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.108805 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.108814 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:14Z","lastTransitionTime":"2025-12-03T12:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.109610 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e483
1acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.120120 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.130473 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.143830 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba
4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.169521 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T12:15:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.189579 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.211345 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.211378 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.211389 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.211406 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.211418 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:14Z","lastTransitionTime":"2025-12-03T12:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.218628 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d6c75c0758bd56511acab8ad071513808bad9827e81f82e91cfe549fffacbe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c76600ba3586e77ab51715123f39b4904e61ad69c3c5976e0bc50e7008a47d8c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"message\\\":\\\"rvice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:12.622262 6000 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:12.622525 6000 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 
12:15:12.622583 6000 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:12.622687 6000 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:12.622715 6000 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:15:12.623100 6000 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 12:15:12.623115 6000 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 12:15:12.623152 6000 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 12:15:12.623170 6000 factory.go:656] Stopping watch factory\\\\nI1203 12:15:12.623180 6000 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 12:15:12.623187 6000 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 12:15:12.623189 6000 ovnkube.go:599] Stopped 
ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\
",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.231338 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.242145 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.254853 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.270041 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ca10d47adbb1e21793d3c069a06e2bcaf9134f0a194fd74c97632317dbbaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964
d75412c30e206ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:
15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:08Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.314488 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.314572 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.314584 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.314600 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.314613 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:14Z","lastTransitionTime":"2025-12-03T12:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.417589 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.417634 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.417647 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.417669 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.417684 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:14Z","lastTransitionTime":"2025-12-03T12:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.520362 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.520407 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.520418 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.520435 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.520445 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:14Z","lastTransitionTime":"2025-12-03T12:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.623412 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.623458 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.623470 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.623488 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.623500 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:14Z","lastTransitionTime":"2025-12-03T12:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.726282 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.726315 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.726325 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.726341 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.726352 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:14Z","lastTransitionTime":"2025-12-03T12:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.770510 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m"] Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.771331 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.775222 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.776375 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.793830 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.808776 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:14Z is after 2025-08-24T17:21:41Z" Dec 03 
12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.817029 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.817029 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:14 crc kubenswrapper[4711]: E1203 12:15:14.817201 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.817045 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:14 crc kubenswrapper[4711]: E1203 12:15:14.817343 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:14 crc kubenswrapper[4711]: E1203 12:15:14.817471 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.820420 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs
.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.825465 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pll8b\" (UniqueName: \"kubernetes.io/projected/114241a2-fe6b-43a6-957c-d215ce55737a-kube-api-access-pll8b\") pod \"ovnkube-control-plane-749d76644c-h648m\" (UID: \"114241a2-fe6b-43a6-957c-d215ce55737a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.825514 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/114241a2-fe6b-43a6-957c-d215ce55737a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-h648m\" (UID: \"114241a2-fe6b-43a6-957c-d215ce55737a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.825583 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/114241a2-fe6b-43a6-957c-d215ce55737a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-h648m\" (UID: \"114241a2-fe6b-43a6-957c-d215ce55737a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.825607 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/114241a2-fe6b-43a6-957c-d215ce55737a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-h648m\" (UID: \"114241a2-fe6b-43a6-957c-d215ce55737a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.828213 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.828241 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.828249 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.828262 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.828272 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:14Z","lastTransitionTime":"2025-12-03T12:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.834044 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114241a2-fe6b-43a6-957c-d215ce55737a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h648m\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.846595 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.860623 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.875128 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.884376 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba
4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.895066 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T12:15:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.905315 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.922322 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d6c75c0758bd56511acab8ad071513808bad9827e81f82e91cfe549fffacbe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c76600ba3586e77ab51715123f39b4904e61ad69c3c5976e0bc50e7008a47d8c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"message\\\":\\\"rvice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:12.622262 6000 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:12.622525 6000 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 
12:15:12.622583 6000 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:12.622687 6000 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:12.622715 6000 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:15:12.623100 6000 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 12:15:12.623115 6000 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 12:15:12.623152 6000 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 12:15:12.623170 6000 factory.go:656] Stopping watch factory\\\\nI1203 12:15:12.623180 6000 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 12:15:12.623187 6000 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 12:15:12.623189 6000 ovnkube.go:599] Stopped 
ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\
",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.926093 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pll8b\" (UniqueName: \"kubernetes.io/projected/114241a2-fe6b-43a6-957c-d215ce55737a-kube-api-access-pll8b\") pod \"ovnkube-control-plane-749d76644c-h648m\" (UID: \"114241a2-fe6b-43a6-957c-d215ce55737a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.926158 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/114241a2-fe6b-43a6-957c-d215ce55737a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-h648m\" (UID: \"114241a2-fe6b-43a6-957c-d215ce55737a\") 
" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.926226 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/114241a2-fe6b-43a6-957c-d215ce55737a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-h648m\" (UID: \"114241a2-fe6b-43a6-957c-d215ce55737a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.926250 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/114241a2-fe6b-43a6-957c-d215ce55737a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-h648m\" (UID: \"114241a2-fe6b-43a6-957c-d215ce55737a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.926887 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/114241a2-fe6b-43a6-957c-d215ce55737a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-h648m\" (UID: \"114241a2-fe6b-43a6-957c-d215ce55737a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.927218 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/114241a2-fe6b-43a6-957c-d215ce55737a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-h648m\" (UID: \"114241a2-fe6b-43a6-957c-d215ce55737a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.930610 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.930639 4711 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.930652 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.930666 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.930675 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:14Z","lastTransitionTime":"2025-12-03T12:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.933387 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/114241a2-fe6b-43a6-957c-d215ce55737a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-h648m\" (UID: \"114241a2-fe6b-43a6-957c-d215ce55737a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.936695 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.940970 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pll8b\" (UniqueName: 
\"kubernetes.io/projected/114241a2-fe6b-43a6-957c-d215ce55737a-kube-api-access-pll8b\") pod \"ovnkube-control-plane-749d76644c-h648m\" (UID: \"114241a2-fe6b-43a6-957c-d215ce55737a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.949211 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ca10d47adbb1e21793d3c069a06e2bcaf9134f0a194fd74c97632317dbbaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2025-12-03T12:15:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfb
b085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":
\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\
\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.958799 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.958951 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.958967 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.958982 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.958991 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:14Z","lastTransitionTime":"2025-12-03T12:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.965455 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:14 crc kubenswrapper[4711]: E1203 12:15:14.971425 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.974051 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.974205 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.974316 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.974419 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.974544 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:14Z","lastTransitionTime":"2025-12-03T12:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.977030 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:14 crc kubenswrapper[4711]: E1203 12:15:14.985698 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.990260 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.990304 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.990316 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.990335 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:14 crc kubenswrapper[4711]: I1203 12:15:14.990346 4711 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:14Z","lastTransitionTime":"2025-12-03T12:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:15 crc kubenswrapper[4711]: E1203 12:15:15.001334 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.005438 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.005476 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.005487 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.005504 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.005515 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:15Z","lastTransitionTime":"2025-12-03T12:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:15 crc kubenswrapper[4711]: E1203 12:15:15.016280 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:15Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.019334 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.019361 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.019371 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.019385 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.019395 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:15Z","lastTransitionTime":"2025-12-03T12:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:15 crc kubenswrapper[4711]: E1203 12:15:15.030086 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:15Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:15 crc kubenswrapper[4711]: E1203 12:15:15.030405 4711 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.032336 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.032464 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.032566 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.032651 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.032762 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:15Z","lastTransitionTime":"2025-12-03T12:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.062661 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ct6xt_33d2332f-fdac-42be-891e-7eaef0e7ca9d/ovnkube-controller/1.log" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.063191 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ct6xt_33d2332f-fdac-42be-891e-7eaef0e7ca9d/ovnkube-controller/0.log" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.065203 4711 generic.go:334] "Generic (PLEG): container finished" podID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerID="2d6c75c0758bd56511acab8ad071513808bad9827e81f82e91cfe549fffacbe4" exitCode=1 Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.065244 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" event={"ID":"33d2332f-fdac-42be-891e-7eaef0e7ca9d","Type":"ContainerDied","Data":"2d6c75c0758bd56511acab8ad071513808bad9827e81f82e91cfe549fffacbe4"} Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.065294 4711 scope.go:117] "RemoveContainer" containerID="c76600ba3586e77ab51715123f39b4904e61ad69c3c5976e0bc50e7008a47d8c" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.065866 4711 scope.go:117] "RemoveContainer" containerID="2d6c75c0758bd56511acab8ad071513808bad9827e81f82e91cfe549fffacbe4" Dec 03 12:15:15 crc kubenswrapper[4711]: E1203 12:15:15.066054 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-ct6xt_openshift-ovn-kubernetes(33d2332f-fdac-42be-891e-7eaef0e7ca9d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.079037 4711 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:15Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.085819 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.089808 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:15Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:15 crc kubenswrapper[4711]: W1203 12:15:15.097855 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod114241a2_fe6b_43a6_957c_d215ce55737a.slice/crio-deeb1588bb064c8c0c83517b6ef9e5e9da440be3c0684518de333e4257bfd06f WatchSource:0}: Error finding container deeb1588bb064c8c0c83517b6ef9e5e9da440be3c0684518de333e4257bfd06f: Status 404 returned error can't find the container with id deeb1588bb064c8c0c83517b6ef9e5e9da440be3c0684518de333e4257bfd06f Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.101089 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:15Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.117442 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ca10d47adbb1e21793d3c069a06e2bcaf9134f0a194fd74c97632317dbbaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964
d75412c30e206ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:
15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:08Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:15Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.132605 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:15Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.135170 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.135214 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 
12:15:15.135226 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.135244 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.135254 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:15Z","lastTransitionTime":"2025-12-03T12:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.143031 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f514
23f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:15Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.154497 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114241a2-fe6b-43a6-957c-d215ce55737a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h648m\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:15Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.166402 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:15Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.179062 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:15Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.192446 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:15Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.202539 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba
4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:15Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.213432 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:15Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.224045 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T12:15:15Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.233846 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:15Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.238328 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.238352 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.238361 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.238374 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.238383 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:15Z","lastTransitionTime":"2025-12-03T12:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.249834 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d6c75c0758bd56511acab8ad071513808bad9827e81f82e91cfe549fffacbe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c76600ba3586e77ab51715123f39b4904e61ad69c3c5976e0bc50e7008a47d8c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"message\\\":\\\"rvice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:12.622262 6000 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:12.622525 6000 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 
12:15:12.622583 6000 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:12.622687 6000 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:12.622715 6000 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:15:12.623100 6000 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 12:15:12.623115 6000 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 12:15:12.623152 6000 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 12:15:12.623170 6000 factory.go:656] Stopping watch factory\\\\nI1203 12:15:12.623180 6000 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 12:15:12.623187 6000 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 12:15:12.623189 6000 ovnkube.go:599] Stopped ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d6c75c0758bd56511acab8ad071513808bad9827e81f82e91cfe549fffacbe4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"message\\\":\\\"or removal\\\\nI1203 12:15:13.907419 6128 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 12:15:13.907429 6128 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 12:15:13.907431 6128 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:13.907458 6128 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 12:15:13.907472 6128 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 12:15:13.907481 6128 
handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 12:15:13.907489 6128 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 12:15:13.907513 6128 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 12:15:13.907685 6128 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:13.907730 6128 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:15:13.907902 6128 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 12:15:13.907957 6128 factory.go:656] Stopping watch factory\\\\nI1203 12:15:13.907973 6128 ovnkube.go:599] Stopped ovnkube\\\\nI1203 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"
/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:15Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.340729 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 
12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.340779 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.340797 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.340823 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.340842 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:15Z","lastTransitionTime":"2025-12-03T12:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.444142 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.444193 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.444209 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.444237 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.444254 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:15Z","lastTransitionTime":"2025-12-03T12:15:15Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.546650 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.546688 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.546696 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.546711 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.546721 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:15Z","lastTransitionTime":"2025-12-03T12:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.649674 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.649724 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.649741 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.649762 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.649778 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:15Z","lastTransitionTime":"2025-12-03T12:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.752709 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.752761 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.752779 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.752804 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.752821 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:15Z","lastTransitionTime":"2025-12-03T12:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.855158 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.855199 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.855211 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.855228 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.855240 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:15Z","lastTransitionTime":"2025-12-03T12:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.957903 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.958023 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.958048 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.958077 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:15 crc kubenswrapper[4711]: I1203 12:15:15.958102 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:15Z","lastTransitionTime":"2025-12-03T12:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.060976 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.061065 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.061089 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.061120 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.061144 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:16Z","lastTransitionTime":"2025-12-03T12:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.070348 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ct6xt_33d2332f-fdac-42be-891e-7eaef0e7ca9d/ovnkube-controller/1.log" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.074630 4711 scope.go:117] "RemoveContainer" containerID="2d6c75c0758bd56511acab8ad071513808bad9827e81f82e91cfe549fffacbe4" Dec 03 12:15:16 crc kubenswrapper[4711]: E1203 12:15:16.074816 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-ct6xt_openshift-ovn-kubernetes(33d2332f-fdac-42be-891e-7eaef0e7ca9d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.075696 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" event={"ID":"114241a2-fe6b-43a6-957c-d215ce55737a","Type":"ContainerStarted","Data":"458346453f3d04e596b36d75a5a157cb23f385d76b0afad50ace54255e6b7553"} Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.075737 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" event={"ID":"114241a2-fe6b-43a6-957c-d215ce55737a","Type":"ContainerStarted","Data":"b7bdd5e9a9885bccbf422740ad714b91e3eb1cc829bddaa977540e1e2be4b4c1"} Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.075756 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" event={"ID":"114241a2-fe6b-43a6-957c-d215ce55737a","Type":"ContainerStarted","Data":"deeb1588bb064c8c0c83517b6ef9e5e9da440be3c0684518de333e4257bfd06f"} Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.089279 4711 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.103270 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.115935 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.127119 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba
4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.142281 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.152779 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.164330 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.164369 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.164378 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.164395 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.164405 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:16Z","lastTransitionTime":"2025-12-03T12:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.182728 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d6c75c0758bd56511acab8ad071513808bad9827e81f82e91cfe549fffacbe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d6c75c0758bd56511acab8ad071513808bad9827e81f82e91cfe549fffacbe4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"message\\\":\\\"or removal\\\\nI1203 12:15:13.907419 6128 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 12:15:13.907429 6128 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 12:15:13.907431 6128 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:13.907458 6128 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 12:15:13.907472 6128 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 12:15:13.907481 6128 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 12:15:13.907489 6128 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 12:15:13.907513 6128 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 12:15:13.907685 6128 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:13.907730 6128 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:15:13.907902 6128 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 12:15:13.907957 6128 factory.go:656] Stopping watch factory\\\\nI1203 12:15:13.907973 6128 ovnkube.go:599] Stopped ovnkube\\\\nI1203 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ct6xt_openshift-ovn-kubernetes(33d2332f-fdac-42be-891e-7eaef0e7ca9d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb6
8d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.201544 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.219878 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.236667 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.257666 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ca10d47adbb1e21793d3c069a06e2bcaf9134f0a194fd74c97632317dbbaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964
d75412c30e206ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:
15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:08Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.266385 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.266464 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.266485 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.266518 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.266537 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:16Z","lastTransitionTime":"2025-12-03T12:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.275753 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.299253 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 
12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.315950 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.330820 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114241a2-fe6b-43a6-957c-d215ce55737a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h648m\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.353040 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.368206 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.369627 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.369687 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:16 crc 
kubenswrapper[4711]: I1203 12:15:16.369703 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.369725 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.369740 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:16Z","lastTransitionTime":"2025-12-03T12:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.383193 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.400413 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ca10d47adbb1e21793d3c069a06e2bcaf9134f0a194fd74c97632317dbbaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964
d75412c30e206ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:
15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:08Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.417954 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.434590 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 
12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.452657 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.466020 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114241a2-fe6b-43a6-957c-d215ce55737a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7bdd5e9a9885bccbf422740ad714b91e3eb1cc829bddaa977540
e1e2be4b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458346453f3d04e596b36d75a5a157cb23f385d76b0afad50ace54255e6b7553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"po
dIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h648m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.473281 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.473323 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.473332 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.473350 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.473360 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:16Z","lastTransitionTime":"2025-12-03T12:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.481680 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e483
1acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.497536 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.512300 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.526996 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba
4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.538609 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.543549 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:15:16 crc kubenswrapper[4711]: E1203 12:15:16.543771 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:15:32.543741024 +0000 UTC m=+51.212992289 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.551239 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.568533 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d6c75c0758bd56511acab8ad071513808bad9827e81f82e91cfe549fffacbe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d6c75c0758bd56511acab8ad071513808bad9827e81f82e91cfe549fffacbe4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"message\\\":\\\"or removal\\\\nI1203 12:15:13.907419 6128 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 12:15:13.907429 6128 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 12:15:13.907431 6128 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:13.907458 6128 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 12:15:13.907472 6128 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 12:15:13.907481 6128 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 12:15:13.907489 6128 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 12:15:13.907513 6128 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 12:15:13.907685 6128 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:13.907730 6128 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:15:13.907902 6128 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 12:15:13.907957 6128 factory.go:656] Stopping watch factory\\\\nI1203 12:15:13.907973 6128 ovnkube.go:599] Stopped ovnkube\\\\nI1203 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ct6xt_openshift-ovn-kubernetes(33d2332f-fdac-42be-891e-7eaef0e7ca9d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb6
8d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.576126 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.576179 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.576190 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.576208 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.576222 4711 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:16Z","lastTransitionTime":"2025-12-03T12:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.597063 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-wd9tz"] Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.597855 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:16 crc kubenswrapper[4711]: E1203 12:15:16.597994 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.612627 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.628967 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.644953 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.645018 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s66kq\" (UniqueName: \"kubernetes.io/projected/cdb7f01e-b2fd-49da-b7de-621da238d797-kube-api-access-s66kq\") pod \"network-metrics-daemon-wd9tz\" (UID: \"cdb7f01e-b2fd-49da-b7de-621da238d797\") " pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.645051 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.645075 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.645100 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cdb7f01e-b2fd-49da-b7de-621da238d797-metrics-certs\") pod \"network-metrics-daemon-wd9tz\" (UID: \"cdb7f01e-b2fd-49da-b7de-621da238d797\") " pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:16 crc kubenswrapper[4711]: E1203 12:15:16.645182 4711 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 12:15:16 crc kubenswrapper[4711]: E1203 12:15:16.645233 4711 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.645277 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:16 crc kubenswrapper[4711]: E1203 12:15:16.645309 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 12:15:32.645284501 +0000 UTC m=+51.314535766 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 12:15:16 crc kubenswrapper[4711]: E1203 12:15:16.645331 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 12:15:32.645321312 +0000 UTC m=+51.314572577 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 12:15:16 crc kubenswrapper[4711]: E1203 12:15:16.645330 4711 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 12:15:16 crc kubenswrapper[4711]: E1203 12:15:16.645382 4711 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 12:15:16 crc kubenswrapper[4711]: E1203 12:15:16.645396 4711 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 12:15:16 crc kubenswrapper[4711]: E1203 12:15:16.645413 4711 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 12:15:16 crc kubenswrapper[4711]: E1203 12:15:16.645427 4711 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:15:16 crc kubenswrapper[4711]: E1203 12:15:16.645471 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-12-03 12:15:32.645455485 +0000 UTC m=+51.314706760 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:15:16 crc kubenswrapper[4711]: E1203 12:15:16.645397 4711 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:15:16 crc kubenswrapper[4711]: E1203 12:15:16.645537 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 12:15:32.645518337 +0000 UTC m=+51.314769672 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.648401 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.665369 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"
containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube
-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.678138 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.678182 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.678193 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.678211 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.678223 4711 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:16Z","lastTransitionTime":"2025-12-03T12:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.680159 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountP
ath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.692663 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.713174 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d6c75c0758bd56511acab8ad071513808bad9827e81f82e91cfe549fffacbe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d6c75c0758bd56511acab8ad071513808bad9827e81f82e91cfe549fffacbe4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"message\\\":\\\"or removal\\\\nI1203 12:15:13.907419 6128 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 12:15:13.907429 6128 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 12:15:13.907431 6128 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:13.907458 6128 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 12:15:13.907472 6128 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 12:15:13.907481 6128 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 12:15:13.907489 6128 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 12:15:13.907513 6128 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 12:15:13.907685 6128 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:13.907730 6128 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:15:13.907902 6128 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 12:15:13.907957 6128 factory.go:656] Stopping watch factory\\\\nI1203 12:15:13.907973 6128 ovnkube.go:599] Stopped ovnkube\\\\nI1203 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ct6xt_openshift-ovn-kubernetes(33d2332f-fdac-42be-891e-7eaef0e7ca9d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb6
8d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.726411 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd9tz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdb7f01e-b2fd-49da-b7de-621da238d797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd9tz\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.739965 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.745949 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cdb7f01e-b2fd-49da-b7de-621da238d797-metrics-certs\") pod \"network-metrics-daemon-wd9tz\" (UID: \"cdb7f01e-b2fd-49da-b7de-621da238d797\") " pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.746021 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s66kq\" (UniqueName: \"kubernetes.io/projected/cdb7f01e-b2fd-49da-b7de-621da238d797-kube-api-access-s66kq\") pod \"network-metrics-daemon-wd9tz\" (UID: \"cdb7f01e-b2fd-49da-b7de-621da238d797\") " pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:16 crc kubenswrapper[4711]: E1203 12:15:16.746212 4711 secret.go:188] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 12:15:16 crc kubenswrapper[4711]: E1203 12:15:16.746356 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cdb7f01e-b2fd-49da-b7de-621da238d797-metrics-certs podName:cdb7f01e-b2fd-49da-b7de-621da238d797 nodeName:}" failed. No retries permitted until 2025-12-03 12:15:17.246318054 +0000 UTC m=+35.915569349 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cdb7f01e-b2fd-49da-b7de-621da238d797-metrics-certs") pod "network-metrics-daemon-wd9tz" (UID: "cdb7f01e-b2fd-49da-b7de-621da238d797") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.759204 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.765839 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s66kq\" (UniqueName: \"kubernetes.io/projected/cdb7f01e-b2fd-49da-b7de-621da238d797-kube-api-access-s66kq\") pod \"network-metrics-daemon-wd9tz\" (UID: \"cdb7f01e-b2fd-49da-b7de-621da238d797\") " pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.776233 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin
\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.780774 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.780820 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.780832 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.780850 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.780863 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:16Z","lastTransitionTime":"2025-12-03T12:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.794609 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ca10d47adbb1e21793d3c069a06e2bcaf9134f0a194fd74c97632317dbbaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.808961 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:
01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.816547 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.816586 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.816558 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:16 crc kubenswrapper[4711]: E1203 12:15:16.816673 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:16 crc kubenswrapper[4711]: E1203 12:15:16.816766 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:16 crc kubenswrapper[4711]: E1203 12:15:16.816938 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.821320 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"n
ame\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.835807 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114241a2-fe6b-43a6-957c-d215ce55737a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7bdd5e9a9885bccbf422740ad714b91e3eb1cc829bddaa977540e1e2be4b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458346453f3d04e596b36d75a5a157cb23f38
5d76b0afad50ace54255e6b7553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h648m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.851822 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:16Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.883668 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.883719 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.883737 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.883760 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.883778 4711 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:16Z","lastTransitionTime":"2025-12-03T12:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.986203 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.986241 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.986250 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.986289 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:16 crc kubenswrapper[4711]: I1203 12:15:16.986300 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:16Z","lastTransitionTime":"2025-12-03T12:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.089136 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.089199 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.089218 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.089243 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.089261 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:17Z","lastTransitionTime":"2025-12-03T12:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.192241 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.192307 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.192319 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.192335 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.192345 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:17Z","lastTransitionTime":"2025-12-03T12:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.251397 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cdb7f01e-b2fd-49da-b7de-621da238d797-metrics-certs\") pod \"network-metrics-daemon-wd9tz\" (UID: \"cdb7f01e-b2fd-49da-b7de-621da238d797\") " pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:17 crc kubenswrapper[4711]: E1203 12:15:17.251610 4711 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 12:15:17 crc kubenswrapper[4711]: E1203 12:15:17.251731 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cdb7f01e-b2fd-49da-b7de-621da238d797-metrics-certs podName:cdb7f01e-b2fd-49da-b7de-621da238d797 nodeName:}" failed. No retries permitted until 2025-12-03 12:15:18.251703295 +0000 UTC m=+36.920954590 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cdb7f01e-b2fd-49da-b7de-621da238d797-metrics-certs") pod "network-metrics-daemon-wd9tz" (UID: "cdb7f01e-b2fd-49da-b7de-621da238d797") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.296010 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.296060 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.296077 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.296101 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.296121 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:17Z","lastTransitionTime":"2025-12-03T12:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.398714 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.398747 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.398755 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.398767 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.398776 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:17Z","lastTransitionTime":"2025-12-03T12:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.501569 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.501999 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.502238 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.502398 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.502582 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:17Z","lastTransitionTime":"2025-12-03T12:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.606067 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.606473 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.606679 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.606875 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.607113 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:17Z","lastTransitionTime":"2025-12-03T12:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.709826 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.709870 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.709879 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.709945 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.709956 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:17Z","lastTransitionTime":"2025-12-03T12:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.818167 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.818400 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.818636 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.818649 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.818665 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.818677 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:17Z","lastTransitionTime":"2025-12-03T12:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:17 crc kubenswrapper[4711]: E1203 12:15:17.819382 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.921263 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.921529 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.921625 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.921727 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:17 crc kubenswrapper[4711]: I1203 12:15:17.921816 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:17Z","lastTransitionTime":"2025-12-03T12:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.024310 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.024777 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.025040 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.025215 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.025379 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:18Z","lastTransitionTime":"2025-12-03T12:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.128145 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.128184 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.128192 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.128206 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.128219 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:18Z","lastTransitionTime":"2025-12-03T12:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.230529 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.230798 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.230809 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.230826 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.230837 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:18Z","lastTransitionTime":"2025-12-03T12:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.263217 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cdb7f01e-b2fd-49da-b7de-621da238d797-metrics-certs\") pod \"network-metrics-daemon-wd9tz\" (UID: \"cdb7f01e-b2fd-49da-b7de-621da238d797\") " pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:18 crc kubenswrapper[4711]: E1203 12:15:18.263363 4711 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 12:15:18 crc kubenswrapper[4711]: E1203 12:15:18.263415 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cdb7f01e-b2fd-49da-b7de-621da238d797-metrics-certs podName:cdb7f01e-b2fd-49da-b7de-621da238d797 nodeName:}" failed. No retries permitted until 2025-12-03 12:15:20.263397673 +0000 UTC m=+38.932648938 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cdb7f01e-b2fd-49da-b7de-621da238d797-metrics-certs") pod "network-metrics-daemon-wd9tz" (UID: "cdb7f01e-b2fd-49da-b7de-621da238d797") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.332749 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.332779 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.332787 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.332799 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.332807 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:18Z","lastTransitionTime":"2025-12-03T12:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.435704 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.435777 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.435801 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.435828 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.435847 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:18Z","lastTransitionTime":"2025-12-03T12:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.539230 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.539291 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.539308 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.539331 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.539347 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:18Z","lastTransitionTime":"2025-12-03T12:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.643086 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.643152 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.643177 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.643209 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.643233 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:18Z","lastTransitionTime":"2025-12-03T12:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.746199 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.746249 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.746265 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.746288 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.746304 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:18Z","lastTransitionTime":"2025-12-03T12:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.817065 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.817114 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.817072 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:18 crc kubenswrapper[4711]: E1203 12:15:18.817225 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:18 crc kubenswrapper[4711]: E1203 12:15:18.817302 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:18 crc kubenswrapper[4711]: E1203 12:15:18.817363 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.849017 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.849064 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.849079 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.849099 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.849114 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:18Z","lastTransitionTime":"2025-12-03T12:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.952116 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.952190 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.952208 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.952236 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:18 crc kubenswrapper[4711]: I1203 12:15:18.952258 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:18Z","lastTransitionTime":"2025-12-03T12:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.055183 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.055246 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.055265 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.055296 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.055318 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:19Z","lastTransitionTime":"2025-12-03T12:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.159042 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.159142 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.159175 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.159205 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.159225 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:19Z","lastTransitionTime":"2025-12-03T12:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.161250 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.162648 4711 scope.go:117] "RemoveContainer" containerID="2d6c75c0758bd56511acab8ad071513808bad9827e81f82e91cfe549fffacbe4" Dec 03 12:15:19 crc kubenswrapper[4711]: E1203 12:15:19.163083 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-ct6xt_openshift-ovn-kubernetes(33d2332f-fdac-42be-891e-7eaef0e7ca9d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.261324 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.261361 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.261373 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.261388 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.261400 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:19Z","lastTransitionTime":"2025-12-03T12:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.363277 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.363314 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.363330 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.363348 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.363360 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:19Z","lastTransitionTime":"2025-12-03T12:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.466554 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.466624 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.466655 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.466682 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.466700 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:19Z","lastTransitionTime":"2025-12-03T12:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.570380 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.570448 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.570472 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.570497 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.570513 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:19Z","lastTransitionTime":"2025-12-03T12:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.602653 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.615709 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.i
o/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:19Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.629426 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512
335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:19Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.651114 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d6c75c0758bd56511acab8ad071513808bad9827e81f82e91cfe549fffacbe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d6c75c0758bd56511acab8ad071513808bad9827e81f82e91cfe549fffacbe4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"message\\\":\\\"or removal\\\\nI1203 12:15:13.907419 6128 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 12:15:13.907429 6128 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 12:15:13.907431 6128 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:13.907458 6128 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 12:15:13.907472 6128 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 12:15:13.907481 6128 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 12:15:13.907489 6128 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 12:15:13.907513 6128 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 12:15:13.907685 6128 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:13.907730 6128 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:15:13.907902 6128 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 12:15:13.907957 6128 factory.go:656] Stopping watch factory\\\\nI1203 12:15:13.907973 6128 ovnkube.go:599] Stopped ovnkube\\\\nI1203 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ct6xt_openshift-ovn-kubernetes(33d2332f-fdac-42be-891e-7eaef0e7ca9d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb6
8d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:19Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.663064 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd9tz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdb7f01e-b2fd-49da-b7de-621da238d797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd9tz\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:19Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.673359 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.673399 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.673407 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.673422 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.673431 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:19Z","lastTransitionTime":"2025-12-03T12:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.680522 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ca10d47adbb1e21793d3c069a06e2bcaf9134f0a194fd74c97632317dbbaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:19Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.695993 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:19Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.713219 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:19Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.726671 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:19Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.742406 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:
00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete 
has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd8325
9fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:19Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.756593 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:19Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.770461 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:19Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.775698 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.775744 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.775763 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.775787 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.775806 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:19Z","lastTransitionTime":"2025-12-03T12:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.788114 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114241a2-fe6b-43a6-957c-d215ce55737a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7bdd5e9a9885bccbf422740ad714b91e3eb1cc829bddaa977540e1e2be4b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458346453f3d04e596b36d75a5a157cb23f385d76b0afad50ace54255e6b7553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h648m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:19Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.802566 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:19Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.816301 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:19 crc kubenswrapper[4711]: E1203 12:15:19.816596 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.820405 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:19Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.833700 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:19Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.850436 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba
4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:19Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.878440 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.878490 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.878510 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:19 crc 
kubenswrapper[4711]: I1203 12:15:19.878536 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.878554 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:19Z","lastTransitionTime":"2025-12-03T12:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.980554 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.980603 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.980619 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.980643 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:19 crc kubenswrapper[4711]: I1203 12:15:19.980661 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:19Z","lastTransitionTime":"2025-12-03T12:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.083973 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.084028 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.084047 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.084066 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.084079 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:20Z","lastTransitionTime":"2025-12-03T12:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.186985 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.187117 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.187139 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.187161 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.187214 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:20Z","lastTransitionTime":"2025-12-03T12:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.290714 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.290769 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.290786 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.290809 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.290827 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:20Z","lastTransitionTime":"2025-12-03T12:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.311324 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cdb7f01e-b2fd-49da-b7de-621da238d797-metrics-certs\") pod \"network-metrics-daemon-wd9tz\" (UID: \"cdb7f01e-b2fd-49da-b7de-621da238d797\") " pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:20 crc kubenswrapper[4711]: E1203 12:15:20.311463 4711 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 12:15:20 crc kubenswrapper[4711]: E1203 12:15:20.311515 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cdb7f01e-b2fd-49da-b7de-621da238d797-metrics-certs podName:cdb7f01e-b2fd-49da-b7de-621da238d797 nodeName:}" failed. No retries permitted until 2025-12-03 12:15:24.311501249 +0000 UTC m=+42.980752494 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cdb7f01e-b2fd-49da-b7de-621da238d797-metrics-certs") pod "network-metrics-daemon-wd9tz" (UID: "cdb7f01e-b2fd-49da-b7de-621da238d797") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.393607 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.393735 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.393757 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.393780 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.393800 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:20Z","lastTransitionTime":"2025-12-03T12:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.497467 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.497549 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.497573 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.497602 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.497624 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:20Z","lastTransitionTime":"2025-12-03T12:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.600804 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.600864 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.600886 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.600953 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.600981 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:20Z","lastTransitionTime":"2025-12-03T12:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.703714 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.703865 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.703950 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.703981 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.704002 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:20Z","lastTransitionTime":"2025-12-03T12:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.808067 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.808152 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.808177 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.808208 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.808232 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:20Z","lastTransitionTime":"2025-12-03T12:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.816631 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.816689 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.816742 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:20 crc kubenswrapper[4711]: E1203 12:15:20.816935 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:20 crc kubenswrapper[4711]: E1203 12:15:20.817029 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:20 crc kubenswrapper[4711]: E1203 12:15:20.817182 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.910880 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.910979 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.910999 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.911029 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:20 crc kubenswrapper[4711]: I1203 12:15:20.911049 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:20Z","lastTransitionTime":"2025-12-03T12:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.014158 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.014230 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.014241 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.014258 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.014270 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:21Z","lastTransitionTime":"2025-12-03T12:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.117450 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.117545 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.117566 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.117597 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.117627 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:21Z","lastTransitionTime":"2025-12-03T12:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.220423 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.220693 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.220713 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.220731 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.220743 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:21Z","lastTransitionTime":"2025-12-03T12:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.322653 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.322694 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.322706 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.322721 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.322733 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:21Z","lastTransitionTime":"2025-12-03T12:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.427846 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.427950 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.427970 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.427995 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.428020 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:21Z","lastTransitionTime":"2025-12-03T12:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.531491 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.531565 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.531582 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.531617 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.531635 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:21Z","lastTransitionTime":"2025-12-03T12:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.634248 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.634322 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.634344 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.634371 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.634393 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:21Z","lastTransitionTime":"2025-12-03T12:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.736726 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.736801 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.736823 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.736855 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.736879 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:21Z","lastTransitionTime":"2025-12-03T12:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.817069 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:21 crc kubenswrapper[4711]: E1203 12:15:21.817198 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.837215 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 
12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:21Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.839976 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.840011 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.840020 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.840034 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.840043 4711 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:21Z","lastTransitionTime":"2025-12-03T12:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.855415 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:21Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.870111 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:21Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.881124 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114241a2-fe6b-43a6-957c-d215ce55737a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7bdd5e9a9885bccbf422740ad714b91e3eb1cc829bddaa977540e1e2be4b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458346453f3d04e596b36d75a5a157cb23f385d76b0afad50ace54255e6b7553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h648m\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:21Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.898340 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:1
4:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9
e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:21Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.920976 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:21Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.939332 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:21Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.941980 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.942014 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.942027 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.942043 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.942059 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:21Z","lastTransitionTime":"2025-12-03T12:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.953556 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:21Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.966213 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-03T12:15:21Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.978563 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:21Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:21 crc kubenswrapper[4711]: I1203 12:15:21.998867 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d6c75c0758bd56511acab8ad071513808bad9827e81f82e91cfe549fffacbe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d6c75c0758bd56511acab8ad071513808bad9827e81f82e91cfe549fffacbe4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"message\\\":\\\"or removal\\\\nI1203 12:15:13.907419 6128 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 12:15:13.907429 6128 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 12:15:13.907431 6128 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:13.907458 6128 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 12:15:13.907472 6128 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 12:15:13.907481 6128 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 12:15:13.907489 6128 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 12:15:13.907513 6128 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 12:15:13.907685 6128 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:13.907730 6128 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:15:13.907902 6128 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 12:15:13.907957 6128 factory.go:656] Stopping watch factory\\\\nI1203 12:15:13.907973 6128 ovnkube.go:599] Stopped ovnkube\\\\nI1203 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ct6xt_openshift-ovn-kubernetes(33d2332f-fdac-42be-891e-7eaef0e7ca9d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb6
8d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:21Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.013540 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd9tz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdb7f01e-b2fd-49da-b7de-621da238d797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd9tz\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:22Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.032418 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ca10d47adbb1e21793d3c069a06e2bcaf9134f0a194fd74c97632317dbbaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a24
73a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/c
ni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:22Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.045682 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.045777 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.045792 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.045822 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.045843 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:22Z","lastTransitionTime":"2025-12-03T12:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.049167 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:22Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.064642 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:22Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.079767 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:22Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.148838 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:22 crc 
kubenswrapper[4711]: I1203 12:15:22.148926 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.148940 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.148956 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.148966 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:22Z","lastTransitionTime":"2025-12-03T12:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.251554 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.251604 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.251614 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.251639 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.251650 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:22Z","lastTransitionTime":"2025-12-03T12:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.354237 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.354286 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.354300 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.354317 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.354334 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:22Z","lastTransitionTime":"2025-12-03T12:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.456820 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.456862 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.456875 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.456891 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.456901 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:22Z","lastTransitionTime":"2025-12-03T12:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.558920 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.558959 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.558970 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.558985 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.558997 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:22Z","lastTransitionTime":"2025-12-03T12:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.661559 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.661611 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.661624 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.661641 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.661656 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:22Z","lastTransitionTime":"2025-12-03T12:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.764525 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.764588 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.764606 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.764630 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.764647 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:22Z","lastTransitionTime":"2025-12-03T12:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.817169 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.817282 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.817199 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:22 crc kubenswrapper[4711]: E1203 12:15:22.817393 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:22 crc kubenswrapper[4711]: E1203 12:15:22.817446 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:22 crc kubenswrapper[4711]: E1203 12:15:22.817589 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.867178 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.867250 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.867261 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.867281 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.867293 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:22Z","lastTransitionTime":"2025-12-03T12:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.970613 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.970686 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.970700 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.970722 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:22 crc kubenswrapper[4711]: I1203 12:15:22.970735 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:22Z","lastTransitionTime":"2025-12-03T12:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.073567 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.073614 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.073624 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.073641 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.073652 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:23Z","lastTransitionTime":"2025-12-03T12:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.176767 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.176818 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.176826 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.176839 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.176868 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:23Z","lastTransitionTime":"2025-12-03T12:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.279866 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.279995 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.280029 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.280057 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.280078 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:23Z","lastTransitionTime":"2025-12-03T12:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.382836 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.382938 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.382959 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.382977 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.382987 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:23Z","lastTransitionTime":"2025-12-03T12:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.486404 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.486453 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.486462 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.486493 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.486511 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:23Z","lastTransitionTime":"2025-12-03T12:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.590313 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.590377 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.590388 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.590405 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.590415 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:23Z","lastTransitionTime":"2025-12-03T12:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.693644 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.693709 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.693726 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.693752 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.693770 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:23Z","lastTransitionTime":"2025-12-03T12:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.797021 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.797074 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.797090 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.797116 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.797133 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:23Z","lastTransitionTime":"2025-12-03T12:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.816637 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:23 crc kubenswrapper[4711]: E1203 12:15:23.816769 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.900016 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.900077 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.900094 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.900117 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:23 crc kubenswrapper[4711]: I1203 12:15:23.900132 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:23Z","lastTransitionTime":"2025-12-03T12:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.003239 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.003363 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.003375 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.003391 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.003402 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:24Z","lastTransitionTime":"2025-12-03T12:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.105564 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.105600 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.105610 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.105625 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.105637 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:24Z","lastTransitionTime":"2025-12-03T12:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.207898 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.207963 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.207972 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.207988 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.208000 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:24Z","lastTransitionTime":"2025-12-03T12:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.311818 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.311925 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.311945 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.311967 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.311982 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:24Z","lastTransitionTime":"2025-12-03T12:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.381767 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cdb7f01e-b2fd-49da-b7de-621da238d797-metrics-certs\") pod \"network-metrics-daemon-wd9tz\" (UID: \"cdb7f01e-b2fd-49da-b7de-621da238d797\") " pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:24 crc kubenswrapper[4711]: E1203 12:15:24.382010 4711 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 12:15:24 crc kubenswrapper[4711]: E1203 12:15:24.382119 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cdb7f01e-b2fd-49da-b7de-621da238d797-metrics-certs podName:cdb7f01e-b2fd-49da-b7de-621da238d797 nodeName:}" failed. No retries permitted until 2025-12-03 12:15:32.382089059 +0000 UTC m=+51.051340354 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cdb7f01e-b2fd-49da-b7de-621da238d797-metrics-certs") pod "network-metrics-daemon-wd9tz" (UID: "cdb7f01e-b2fd-49da-b7de-621da238d797") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.419227 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.419295 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.419318 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.419351 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.419376 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:24Z","lastTransitionTime":"2025-12-03T12:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.522665 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.522730 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.522741 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.522766 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.522782 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:24Z","lastTransitionTime":"2025-12-03T12:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.625191 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.625228 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.625269 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.625286 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.625295 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:24Z","lastTransitionTime":"2025-12-03T12:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.728322 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.728367 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.728379 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.728398 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.728407 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:24Z","lastTransitionTime":"2025-12-03T12:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.816280 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.816280 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:24 crc kubenswrapper[4711]: E1203 12:15:24.816498 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.816327 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:24 crc kubenswrapper[4711]: E1203 12:15:24.816571 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:24 crc kubenswrapper[4711]: E1203 12:15:24.816716 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.830573 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.830650 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.830672 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.830702 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.830723 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:24Z","lastTransitionTime":"2025-12-03T12:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.933384 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.933445 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.933463 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.933488 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:24 crc kubenswrapper[4711]: I1203 12:15:24.933504 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:24Z","lastTransitionTime":"2025-12-03T12:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.035931 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.035974 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.035985 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.036024 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.036039 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:25Z","lastTransitionTime":"2025-12-03T12:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.138661 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.138703 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.138712 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.138728 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.138737 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:25Z","lastTransitionTime":"2025-12-03T12:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.242350 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.242402 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.242548 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.242582 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.242599 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:25Z","lastTransitionTime":"2025-12-03T12:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.345737 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.345804 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.345815 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.345834 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.345847 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:25Z","lastTransitionTime":"2025-12-03T12:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.412496 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.412579 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.412598 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.412620 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.412669 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:25Z","lastTransitionTime":"2025-12-03T12:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:25 crc kubenswrapper[4711]: E1203 12:15:25.426160 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:25Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.430502 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.430575 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.430595 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.430617 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.430635 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:25Z","lastTransitionTime":"2025-12-03T12:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:25 crc kubenswrapper[4711]: E1203 12:15:25.446325 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:25Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.451498 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.451530 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.451541 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.451555 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.451566 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:25Z","lastTransitionTime":"2025-12-03T12:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:25 crc kubenswrapper[4711]: E1203 12:15:25.468511 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:25Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.472269 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.472318 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.472335 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.472357 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.472373 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:25Z","lastTransitionTime":"2025-12-03T12:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:25 crc kubenswrapper[4711]: E1203 12:15:25.486068 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:25Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.489996 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.490052 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.490070 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.490108 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.490127 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:25Z","lastTransitionTime":"2025-12-03T12:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:25 crc kubenswrapper[4711]: E1203 12:15:25.506412 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:25Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:25 crc kubenswrapper[4711]: E1203 12:15:25.506656 4711 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.509322 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.509399 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.509419 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.509444 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.509464 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:25Z","lastTransitionTime":"2025-12-03T12:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.612782 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.612846 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.612870 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.612900 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.612976 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:25Z","lastTransitionTime":"2025-12-03T12:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.715476 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.715510 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.715518 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.715531 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.715541 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:25Z","lastTransitionTime":"2025-12-03T12:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.816760 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:25 crc kubenswrapper[4711]: E1203 12:15:25.816922 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.822075 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.822119 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.822145 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.822165 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.822179 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:25Z","lastTransitionTime":"2025-12-03T12:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.925499 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.925569 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.925588 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.925613 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:25 crc kubenswrapper[4711]: I1203 12:15:25.925631 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:25Z","lastTransitionTime":"2025-12-03T12:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.027282 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.027309 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.027317 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.027329 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.027338 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:26Z","lastTransitionTime":"2025-12-03T12:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.129438 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.129478 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.129485 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.129500 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.129508 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:26Z","lastTransitionTime":"2025-12-03T12:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.231449 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.231511 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.231544 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.231582 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.231606 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:26Z","lastTransitionTime":"2025-12-03T12:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.334178 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.334250 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.334275 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.334305 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.334326 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:26Z","lastTransitionTime":"2025-12-03T12:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.437621 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.437698 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.437715 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.437740 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.437756 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:26Z","lastTransitionTime":"2025-12-03T12:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.540491 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.540542 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.540555 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.540575 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.540588 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:26Z","lastTransitionTime":"2025-12-03T12:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.643035 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.643078 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.643092 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.643111 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.643124 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:26Z","lastTransitionTime":"2025-12-03T12:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.745967 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.746027 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.746045 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.746068 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.746084 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:26Z","lastTransitionTime":"2025-12-03T12:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.817073 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.817073 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:26 crc kubenswrapper[4711]: E1203 12:15:26.817215 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:26 crc kubenswrapper[4711]: E1203 12:15:26.817301 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.817823 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:26 crc kubenswrapper[4711]: E1203 12:15:26.818058 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.848272 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.848332 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.848340 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.848357 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.848367 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:26Z","lastTransitionTime":"2025-12-03T12:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.950858 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.950903 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.950932 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.950950 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:26 crc kubenswrapper[4711]: I1203 12:15:26.950961 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:26Z","lastTransitionTime":"2025-12-03T12:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.054149 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.054263 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.054276 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.054306 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.054324 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:27Z","lastTransitionTime":"2025-12-03T12:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.157333 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.157373 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.157381 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.157398 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.157411 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:27Z","lastTransitionTime":"2025-12-03T12:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.259749 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.259816 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.259827 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.259848 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.259861 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:27Z","lastTransitionTime":"2025-12-03T12:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.362756 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.362804 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.362822 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.362839 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.362853 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:27Z","lastTransitionTime":"2025-12-03T12:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.465802 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.465878 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.465892 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.465963 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.465982 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:27Z","lastTransitionTime":"2025-12-03T12:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.569049 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.569105 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.569121 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.569145 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.569163 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:27Z","lastTransitionTime":"2025-12-03T12:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.672282 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.672346 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.672364 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.672389 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.672409 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:27Z","lastTransitionTime":"2025-12-03T12:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.775138 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.775202 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.775223 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.775248 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.775266 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:27Z","lastTransitionTime":"2025-12-03T12:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.816650 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:27 crc kubenswrapper[4711]: E1203 12:15:27.816800 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.877497 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.877540 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.877549 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.877563 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.877573 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:27Z","lastTransitionTime":"2025-12-03T12:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.980787 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.980842 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.980859 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.980881 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:27 crc kubenswrapper[4711]: I1203 12:15:27.980898 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:27Z","lastTransitionTime":"2025-12-03T12:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.083343 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.083388 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.083398 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.083415 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.083426 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:28Z","lastTransitionTime":"2025-12-03T12:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.186627 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.186692 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.186713 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.186741 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.186762 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:28Z","lastTransitionTime":"2025-12-03T12:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.290404 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.290449 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.290462 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.290480 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.290493 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:28Z","lastTransitionTime":"2025-12-03T12:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.394136 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.394204 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.394221 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.394248 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.394265 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:28Z","lastTransitionTime":"2025-12-03T12:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.497169 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.497233 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.497245 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.497265 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.497276 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:28Z","lastTransitionTime":"2025-12-03T12:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.600110 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.600153 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.600164 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.600184 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.600195 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:28Z","lastTransitionTime":"2025-12-03T12:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.703128 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.703175 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.703193 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.703215 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.703232 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:28Z","lastTransitionTime":"2025-12-03T12:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.806033 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.806102 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.806117 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.806134 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.806145 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:28Z","lastTransitionTime":"2025-12-03T12:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.817242 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.817280 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.817316 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:28 crc kubenswrapper[4711]: E1203 12:15:28.817440 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:28 crc kubenswrapper[4711]: E1203 12:15:28.817663 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:28 crc kubenswrapper[4711]: E1203 12:15:28.817858 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.909287 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.909390 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.909440 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.909467 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:28 crc kubenswrapper[4711]: I1203 12:15:28.909484 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:28Z","lastTransitionTime":"2025-12-03T12:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.013111 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.013169 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.013180 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.013211 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.013220 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:29Z","lastTransitionTime":"2025-12-03T12:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.114851 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.114928 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.114952 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.114970 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.114982 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:29Z","lastTransitionTime":"2025-12-03T12:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.218769 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.219071 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.219090 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.219115 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.219135 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:29Z","lastTransitionTime":"2025-12-03T12:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.322234 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.322315 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.322343 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.322375 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.322394 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:29Z","lastTransitionTime":"2025-12-03T12:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.424721 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.424779 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.424792 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.424811 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.424825 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:29Z","lastTransitionTime":"2025-12-03T12:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.527959 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.528031 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.528055 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.528102 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.528128 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:29Z","lastTransitionTime":"2025-12-03T12:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.629670 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.629708 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.629719 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.629735 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.629748 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:29Z","lastTransitionTime":"2025-12-03T12:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.732761 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.732799 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.732808 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.732826 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.732837 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:29Z","lastTransitionTime":"2025-12-03T12:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.817031 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:29 crc kubenswrapper[4711]: E1203 12:15:29.817251 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.835881 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.836012 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.836038 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.836076 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.836110 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:29Z","lastTransitionTime":"2025-12-03T12:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.938884 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.938951 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.938964 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.938981 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:29 crc kubenswrapper[4711]: I1203 12:15:29.938992 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:29Z","lastTransitionTime":"2025-12-03T12:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.042383 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.042457 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.042481 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.042563 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.042593 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:30Z","lastTransitionTime":"2025-12-03T12:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.144413 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.144488 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.144510 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.144541 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.144561 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:30Z","lastTransitionTime":"2025-12-03T12:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.246741 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.246789 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.246807 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.246822 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.246833 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:30Z","lastTransitionTime":"2025-12-03T12:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.348849 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.348892 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.348901 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.348931 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.348940 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:30Z","lastTransitionTime":"2025-12-03T12:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.452179 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.452289 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.452359 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.452390 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.452407 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:30Z","lastTransitionTime":"2025-12-03T12:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.555198 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.555287 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.555312 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.555344 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.555370 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:30Z","lastTransitionTime":"2025-12-03T12:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.658643 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.658708 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.658728 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.658753 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.658771 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:30Z","lastTransitionTime":"2025-12-03T12:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.762144 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.762205 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.762239 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.762264 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.762279 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:30Z","lastTransitionTime":"2025-12-03T12:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.816941 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.817024 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:30 crc kubenswrapper[4711]: E1203 12:15:30.817105 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:30 crc kubenswrapper[4711]: E1203 12:15:30.817198 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.817329 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:30 crc kubenswrapper[4711]: E1203 12:15:30.817448 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.864879 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.864950 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.864966 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.864986 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.865000 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:30Z","lastTransitionTime":"2025-12-03T12:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.968271 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.968322 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.968339 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.968364 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:30 crc kubenswrapper[4711]: I1203 12:15:30.968380 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:30Z","lastTransitionTime":"2025-12-03T12:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.070830 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.070960 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.070997 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.071028 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.071049 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:31Z","lastTransitionTime":"2025-12-03T12:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.174060 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.174106 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.174117 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.174134 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.174145 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:31Z","lastTransitionTime":"2025-12-03T12:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.277297 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.277378 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.277401 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.277430 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.277455 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:31Z","lastTransitionTime":"2025-12-03T12:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.380343 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.380399 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.380410 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.380425 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.380437 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:31Z","lastTransitionTime":"2025-12-03T12:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.482851 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.482967 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.482987 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.483010 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.483027 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:31Z","lastTransitionTime":"2025-12-03T12:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.586046 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.586112 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.586133 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.586160 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.586182 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:31Z","lastTransitionTime":"2025-12-03T12:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.689272 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.689314 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.689325 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.689340 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.689352 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:31Z","lastTransitionTime":"2025-12-03T12:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.791895 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.791965 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.791976 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.791996 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.792007 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:31Z","lastTransitionTime":"2025-12-03T12:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.817825 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:31 crc kubenswrapper[4711]: E1203 12:15:31.818080 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.836798 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:31Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.859991 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:31Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.883284 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:31Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.894864 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:31 crc 
kubenswrapper[4711]: I1203 12:15:31.894954 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.894979 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.895057 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.895171 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:31Z","lastTransitionTime":"2025-12-03T12:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.917025 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ca10d47adbb1e21793d3c069a06e2bcaf9134f0a194fd74c97632317dbbaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6c4
b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:31Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.934409 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:31Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.949407 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:31Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.969465 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114241a2-fe6b-43a6-957c-d215ce55737a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7bdd5e9a9885bccbf422740ad714b91e3eb1cc829bddaa977540e1e2be4b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458346453f3d04e596b36d75a5a157cb23f385d76b0afad50ace54255e6b7553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h648m\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:31Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.995480 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\
\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70
082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:31Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.999195 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.999270 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.999297 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.999324 4711 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Dec 03 12:15:31 crc kubenswrapper[4711]: I1203 12:15:31.999341 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:31Z","lastTransitionTime":"2025-12-03T12:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.019466 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedA
t\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:32Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.038558 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:32Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.052585 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba
4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:32Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.075145 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:32Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.089887 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T12:15:32Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.102627 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.102711 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.102737 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.102772 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.102795 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:32Z","lastTransitionTime":"2025-12-03T12:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.105793 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:32Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.129300 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d6c75c0758bd56511acab8ad071513808bad9827e81f82e91cfe549fffacbe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d6c75c0758bd56511acab8ad071513808bad9827e81f82e91cfe549fffacbe4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"message\\\":\\\"or removal\\\\nI1203 12:15:13.907419 6128 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 12:15:13.907429 6128 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 12:15:13.907431 6128 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:13.907458 6128 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 12:15:13.907472 6128 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 12:15:13.907481 6128 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 12:15:13.907489 6128 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 12:15:13.907513 6128 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 12:15:13.907685 6128 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:13.907730 6128 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:15:13.907902 6128 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 12:15:13.907957 6128 factory.go:656] Stopping watch factory\\\\nI1203 12:15:13.907973 6128 ovnkube.go:599] Stopped ovnkube\\\\nI1203 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ct6xt_openshift-ovn-kubernetes(33d2332f-fdac-42be-891e-7eaef0e7ca9d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb6
8d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:32Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.142780 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd9tz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdb7f01e-b2fd-49da-b7de-621da238d797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd9tz\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:32Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.205598 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.205660 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.205684 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.205714 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.205733 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:32Z","lastTransitionTime":"2025-12-03T12:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.308286 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.308326 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.308337 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.308354 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.308366 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:32Z","lastTransitionTime":"2025-12-03T12:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.411623 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.411675 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.411687 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.411703 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.411716 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:32Z","lastTransitionTime":"2025-12-03T12:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.470572 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cdb7f01e-b2fd-49da-b7de-621da238d797-metrics-certs\") pod \"network-metrics-daemon-wd9tz\" (UID: \"cdb7f01e-b2fd-49da-b7de-621da238d797\") " pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:32 crc kubenswrapper[4711]: E1203 12:15:32.470817 4711 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 12:15:32 crc kubenswrapper[4711]: E1203 12:15:32.470940 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cdb7f01e-b2fd-49da-b7de-621da238d797-metrics-certs podName:cdb7f01e-b2fd-49da-b7de-621da238d797 nodeName:}" failed. No retries permitted until 2025-12-03 12:15:48.470881247 +0000 UTC m=+67.140132542 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cdb7f01e-b2fd-49da-b7de-621da238d797-metrics-certs") pod "network-metrics-daemon-wd9tz" (UID: "cdb7f01e-b2fd-49da-b7de-621da238d797") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.514486 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.514523 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.514536 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.514553 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.514566 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:32Z","lastTransitionTime":"2025-12-03T12:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.571710 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:15:32 crc kubenswrapper[4711]: E1203 12:15:32.571941 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:04.5718982 +0000 UTC m=+83.241149485 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.617616 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.617696 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.617722 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.617745 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:32 crc kubenswrapper[4711]: 
I1203 12:15:32.617765 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:32Z","lastTransitionTime":"2025-12-03T12:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.672798 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.672877 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.673038 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.673125 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:32 crc kubenswrapper[4711]: E1203 12:15:32.673051 4711 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 12:15:32 crc kubenswrapper[4711]: E1203 12:15:32.673189 4711 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 12:15:32 crc kubenswrapper[4711]: E1203 12:15:32.673204 4711 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:15:32 crc kubenswrapper[4711]: E1203 12:15:32.673050 4711 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 12:15:32 crc kubenswrapper[4711]: E1203 12:15:32.673305 4711 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 12:15:32 crc kubenswrapper[4711]: E1203 12:15:32.673287 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 12:16:04.673261372 +0000 UTC m=+83.342512687 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:15:32 crc kubenswrapper[4711]: E1203 12:15:32.673350 4711 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 12:15:32 crc kubenswrapper[4711]: E1203 12:15:32.673400 4711 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 12:15:32 crc kubenswrapper[4711]: E1203 12:15:32.673427 4711 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:15:32 crc kubenswrapper[4711]: E1203 12:15:32.673366 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 12:16:04.673348374 +0000 UTC m=+83.342599749 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 12:15:32 crc kubenswrapper[4711]: E1203 12:15:32.673497 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 12:16:04.673481057 +0000 UTC m=+83.342732422 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 12:15:32 crc kubenswrapper[4711]: E1203 12:15:32.673520 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 12:16:04.673508498 +0000 UTC m=+83.342759883 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.720878 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.720953 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.720970 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.720996 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.721013 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:32Z","lastTransitionTime":"2025-12-03T12:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.817000 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.817062 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:32 crc kubenswrapper[4711]: E1203 12:15:32.817326 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.817359 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:32 crc kubenswrapper[4711]: E1203 12:15:32.818171 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:32 crc kubenswrapper[4711]: E1203 12:15:32.818354 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.818608 4711 scope.go:117] "RemoveContainer" containerID="2d6c75c0758bd56511acab8ad071513808bad9827e81f82e91cfe549fffacbe4" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.822685 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.822749 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.822764 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.822784 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.822796 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:32Z","lastTransitionTime":"2025-12-03T12:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.924889 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.925199 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.925211 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.925225 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:32 crc kubenswrapper[4711]: I1203 12:15:32.925235 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:32Z","lastTransitionTime":"2025-12-03T12:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.027788 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.027823 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.027834 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.027849 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.027860 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:33Z","lastTransitionTime":"2025-12-03T12:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.129188 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ct6xt_33d2332f-fdac-42be-891e-7eaef0e7ca9d/ovnkube-controller/1.log" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.129407 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.129427 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.129435 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.129448 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.129459 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:33Z","lastTransitionTime":"2025-12-03T12:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.131340 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" event={"ID":"33d2332f-fdac-42be-891e-7eaef0e7ca9d","Type":"ContainerStarted","Data":"f678b995eb69a7797bde27947370a777ae334b18c0c79be36dd090741330d5b3"} Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.131708 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.145287 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd
791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name
\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:33Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.159968 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:33Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.171926 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:33Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.183998 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114241a2-fe6b-43a6-957c-d215ce55737a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7bdd5e9a9885bccbf422740ad714b91e3eb1cc829bddaa977540e1e2be4b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458346453f3d04e596b36d75a5a157cb23f385d76b0afad50ace54255e6b7553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h648m\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:33Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.198523 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:1
4:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9
e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:33Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.214724 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:33Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.228093 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:33Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.230985 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.231013 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.231022 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.231035 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.231044 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:33Z","lastTransitionTime":"2025-12-03T12:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.239603 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:33Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.248288 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd9tz" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdb7f01e-b2fd-49da-b7de-621da238d797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd9tz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:33Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:33 crc 
kubenswrapper[4711]: I1203 12:15:33.258706 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:33Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.267817 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:33Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.290371 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f678b995eb69a7797bde27947370a777ae334b18c0c79be36dd090741330d5b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d6c75c0758bd56511acab8ad071513808bad9827e81f82e91cfe549fffacbe4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"message\\\":\\\"or removal\\\\nI1203 12:15:13.907419 6128 handler.go:190] Sending *v1.Namespace event 
handler 1 for removal\\\\nI1203 12:15:13.907429 6128 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 12:15:13.907431 6128 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:13.907458 6128 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 12:15:13.907472 6128 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 12:15:13.907481 6128 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 12:15:13.907489 6128 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 12:15:13.907513 6128 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 12:15:13.907685 6128 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:13.907730 6128 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:15:13.907902 6128 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 12:15:13.907957 6128 factory.go:656] Stopping watch factory\\\\nI1203 12:15:13.907973 6128 ovnkube.go:599] Stopped ovnkube\\\\nI1203 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:33Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.307618 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:33Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.324213 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ca10d47adbb1e21793d3c069a06e2bcaf9134f0a194fd74c97632317dbbaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964
d75412c30e206ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:
15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:08Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:33Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.333902 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.333949 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.333960 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.333977 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.333988 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:33Z","lastTransitionTime":"2025-12-03T12:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.339892 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:33Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.354051 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:33Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.436767 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.436831 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:33 crc 
kubenswrapper[4711]: I1203 12:15:33.436848 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.436874 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.436891 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:33Z","lastTransitionTime":"2025-12-03T12:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.539379 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.539435 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.539452 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.539476 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.539493 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:33Z","lastTransitionTime":"2025-12-03T12:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.641411 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.641458 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.641470 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.641491 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.641504 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:33Z","lastTransitionTime":"2025-12-03T12:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.743978 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.744040 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.744065 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.744100 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.744122 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:33Z","lastTransitionTime":"2025-12-03T12:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.816497 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:33 crc kubenswrapper[4711]: E1203 12:15:33.816703 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.845988 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.846206 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.846273 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.846361 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.846425 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:33Z","lastTransitionTime":"2025-12-03T12:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.949770 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.950032 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.950180 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.950284 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:33 crc kubenswrapper[4711]: I1203 12:15:33.950361 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:33Z","lastTransitionTime":"2025-12-03T12:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.052827 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.052874 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.052886 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.052924 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.052936 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:34Z","lastTransitionTime":"2025-12-03T12:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.137787 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ct6xt_33d2332f-fdac-42be-891e-7eaef0e7ca9d/ovnkube-controller/2.log" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.139233 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ct6xt_33d2332f-fdac-42be-891e-7eaef0e7ca9d/ovnkube-controller/1.log" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.143493 4711 generic.go:334] "Generic (PLEG): container finished" podID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerID="f678b995eb69a7797bde27947370a777ae334b18c0c79be36dd090741330d5b3" exitCode=1 Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.143547 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" event={"ID":"33d2332f-fdac-42be-891e-7eaef0e7ca9d","Type":"ContainerDied","Data":"f678b995eb69a7797bde27947370a777ae334b18c0c79be36dd090741330d5b3"} Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.143598 4711 scope.go:117] "RemoveContainer" containerID="2d6c75c0758bd56511acab8ad071513808bad9827e81f82e91cfe549fffacbe4" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.145061 4711 scope.go:117] "RemoveContainer" containerID="f678b995eb69a7797bde27947370a777ae334b18c0c79be36dd090741330d5b3" Dec 03 12:15:34 crc kubenswrapper[4711]: E1203 12:15:34.145325 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-ct6xt_openshift-ovn-kubernetes(33d2332f-fdac-42be-891e-7eaef0e7ca9d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.156612 4711 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.156663 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.156678 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.156732 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.156751 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:34Z","lastTransitionTime":"2025-12-03T12:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.159591 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:34Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.171855 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:34Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.185439 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:34Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.199339 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114241a2-fe6b-43a6-957c-d215ce55737a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7bdd5e9a9885bccbf422740ad714b91e3eb1cc829bddaa977540e1e2be4b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458346453f3d04e596b36d75a5a157cb23f385d76b0afad50ace54255e6b7553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h648m\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:34Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.211775 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:1
4:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9
e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:34Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.226735 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:34Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.241439 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:34Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.251969 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba
4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:34Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.259022 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.259061 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.259072 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:34 crc 
kubenswrapper[4711]: I1203 12:15:34.259087 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.259097 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:34Z","lastTransitionTime":"2025-12-03T12:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.265239 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:34Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.276903 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:34Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.297838 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f678b995eb69a7797bde27947370a777ae334b18c0c79be36dd090741330d5b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d6c75c0758bd56511acab8ad071513808bad9827e81f82e91cfe549fffacbe4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"message\\\":\\\"or removal\\\\nI1203 12:15:13.907419 6128 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 12:15:13.907429 6128 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 12:15:13.907431 6128 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:13.907458 6128 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 12:15:13.907472 6128 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 12:15:13.907481 6128 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 12:15:13.907489 6128 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 12:15:13.907513 6128 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 12:15:13.907685 6128 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:13.907730 6128 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:15:13.907902 6128 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 12:15:13.907957 6128 factory.go:656] Stopping watch factory\\\\nI1203 12:15:13.907973 6128 ovnkube.go:599] Stopped ovnkube\\\\nI1203 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f678b995eb69a7797bde27947370a777ae334b18c0c79be36dd090741330d5b3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:15:33Z\\\",\\\"message\\\":\\\"workPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 12:15:33.666262 6363 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.666491 6363 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:15:33.666538 6363 reflector.go:311] 
Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.666700 6363 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.666829 6363 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.667208 6363 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 12:15:33.668065 6363 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 12:15:33.668094 6363 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 12:15:33.668128 6363 factory.go:656] Stopping watch factory\\\\nI1203 12:15:33.668129 6363 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"h
ost-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:34Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.312321 4711 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-wd9tz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdb7f01e-b2fd-49da-b7de-621da238d797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd9tz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:34Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:34 crc 
kubenswrapper[4711]: I1203 12:15:34.324206 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:34Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.335859 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:34Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.349857 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:34Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.361002 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:34 crc 
kubenswrapper[4711]: I1203 12:15:34.361032 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.361041 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.361054 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.361066 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:34Z","lastTransitionTime":"2025-12-03T12:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.365751 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ca10d47adbb1e21793d3c069a06e2bcaf9134f0a194fd74c97632317dbbaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6c4
b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:34Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.464062 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.464112 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.464124 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.464148 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.464160 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:34Z","lastTransitionTime":"2025-12-03T12:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.565753 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.565819 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.565831 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.565844 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.565853 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:34Z","lastTransitionTime":"2025-12-03T12:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.668493 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.668575 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.668601 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.668629 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.668652 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:34Z","lastTransitionTime":"2025-12-03T12:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.771381 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.771446 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.771458 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.771481 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.771498 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:34Z","lastTransitionTime":"2025-12-03T12:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.817175 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.817245 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:34 crc kubenswrapper[4711]: E1203 12:15:34.817351 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:34 crc kubenswrapper[4711]: E1203 12:15:34.817480 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.817861 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:34 crc kubenswrapper[4711]: E1203 12:15:34.818301 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.873631 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.873684 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.873708 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.873732 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.873749 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:34Z","lastTransitionTime":"2025-12-03T12:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.976282 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.976327 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.976341 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.976359 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:34 crc kubenswrapper[4711]: I1203 12:15:34.976371 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:34Z","lastTransitionTime":"2025-12-03T12:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.079045 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.079099 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.079111 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.079130 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.079143 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:35Z","lastTransitionTime":"2025-12-03T12:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.148787 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ct6xt_33d2332f-fdac-42be-891e-7eaef0e7ca9d/ovnkube-controller/2.log" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.152825 4711 scope.go:117] "RemoveContainer" containerID="f678b995eb69a7797bde27947370a777ae334b18c0c79be36dd090741330d5b3" Dec 03 12:15:35 crc kubenswrapper[4711]: E1203 12:15:35.153115 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-ct6xt_openshift-ovn-kubernetes(33d2332f-fdac-42be-891e-7eaef0e7ca9d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.165648 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba
4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:35Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.181770 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.181807 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.181815 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:35 crc 
kubenswrapper[4711]: I1203 12:15:35.181830 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.181839 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:35Z","lastTransitionTime":"2025-12-03T12:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.183448 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:35Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.198578 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:35Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.215711 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:35Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.241662 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f678b995eb69a7797bde27947370a777ae334b18c0c79be36dd090741330d5b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f678b995eb69a7797bde27947370a777ae334b18c0c79be36dd090741330d5b3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:15:33Z\\\",\\\"message\\\":\\\"workPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 12:15:33.666262 6363 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.666491 6363 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:15:33.666538 6363 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.666700 6363 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.666829 6363 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.667208 6363 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 12:15:33.668065 6363 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 12:15:33.668094 6363 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 12:15:33.668128 6363 factory.go:656] Stopping watch factory\\\\nI1203 12:15:33.668129 6363 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ct6xt_openshift-ovn-kubernetes(33d2332f-fdac-42be-891e-7eaef0e7ca9d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb6
8d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:35Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.255634 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd9tz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdb7f01e-b2fd-49da-b7de-621da238d797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd9tz\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:35Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.267009 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:35Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.282328 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:35Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.284283 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.284351 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.284376 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.284402 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.284421 
4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:35Z","lastTransitionTime":"2025-12-03T12:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.300516 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:35Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.318472 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:35Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.338698 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ca10d47adbb1e21793d3c069a06e2bcaf9134f0a194fd74c97632317dbbaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964
d75412c30e206ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:
15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:08Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:35Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.355411 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:35Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.368187 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114241a2-fe6b-43a6-957c-d215ce55737a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7bdd5e9a9885bccbf422740ad714b91e3eb1cc829bddaa977540e1e2be4b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458346453f3d04e596b36d75a5a157cb23f38
5d76b0afad50ace54255e6b7553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h648m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:35Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.386433 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.386496 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.386513 4711 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.386536 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.386554 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:35Z","lastTransitionTime":"2025-12-03T12:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.390269 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35
825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] 
\\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:35Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.408633 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:35Z is after 2025-08-24T17:21:41Z" Dec 03 
12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.422004 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:35Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.489771 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.489841 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.489861 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.489892 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.489939 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:35Z","lastTransitionTime":"2025-12-03T12:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.575756 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.575835 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.575849 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.575875 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.575931 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:35Z","lastTransitionTime":"2025-12-03T12:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:35 crc kubenswrapper[4711]: E1203 12:15:35.594937 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:35Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.600412 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.600492 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.600518 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.600550 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.600573 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:35Z","lastTransitionTime":"2025-12-03T12:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:35 crc kubenswrapper[4711]: E1203 12:15:35.619453 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:35Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.624236 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.624284 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.624301 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.624328 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.624347 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:35Z","lastTransitionTime":"2025-12-03T12:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.649843 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.649902 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.649972 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.650003 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.650029 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:35Z","lastTransitionTime":"2025-12-03T12:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:35 crc kubenswrapper[4711]: E1203 12:15:35.671291 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:35Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.680328 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.680368 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.680378 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.680396 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.680407 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:35Z","lastTransitionTime":"2025-12-03T12:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:35 crc kubenswrapper[4711]: E1203 12:15:35.694865 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:35Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:35 crc kubenswrapper[4711]: E1203 12:15:35.695146 4711 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.697183 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.697268 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.697287 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.697319 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.697338 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:35Z","lastTransitionTime":"2025-12-03T12:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.799952 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.800017 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.800042 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.800071 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.800096 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:35Z","lastTransitionTime":"2025-12-03T12:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.816980 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:35 crc kubenswrapper[4711]: E1203 12:15:35.817199 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.902392 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.902630 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.902697 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.902760 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:35 crc kubenswrapper[4711]: I1203 12:15:35.902817 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:35Z","lastTransitionTime":"2025-12-03T12:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.005240 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.005529 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.005595 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.005681 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.005742 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:36Z","lastTransitionTime":"2025-12-03T12:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.108140 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.108178 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.108187 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.108204 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.108213 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:36Z","lastTransitionTime":"2025-12-03T12:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.211163 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.211205 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.211217 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.211233 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.211245 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:36Z","lastTransitionTime":"2025-12-03T12:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.224090 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.237899 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.240830 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd9tz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdb7f01e-b2fd-49da-b7de-621da238d797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd9tz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:36Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:36 crc 
kubenswrapper[4711]: I1203 12:15:36.259043 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:36Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.274453 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:36Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.298816 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f678b995eb69a7797bde27947370a777ae334b18c0c79be36dd090741330d5b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f678b995eb69a7797bde27947370a777ae334b18c0c79be36dd090741330d5b3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:15:33Z\\\",\\\"message\\\":\\\"workPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 12:15:33.666262 6363 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.666491 6363 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:15:33.666538 6363 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.666700 6363 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.666829 6363 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.667208 6363 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 12:15:33.668065 6363 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 12:15:33.668094 6363 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 12:15:33.668128 6363 factory.go:656] Stopping watch factory\\\\nI1203 12:15:33.668129 6363 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ct6xt_openshift-ovn-kubernetes(33d2332f-fdac-42be-891e-7eaef0e7ca9d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb6
8d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:36Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.313973 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.314006 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.314017 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.314036 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.314049 4711 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:36Z","lastTransitionTime":"2025-12-03T12:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.315408 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:
15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:36Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.337379 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ca10d47adbb1e21793d3c069a06e2bcaf9134f0a194fd74c97632317dbbaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additiona
l-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8
c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2025-12-03T12:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/op
t/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:36Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.353450 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:36Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.367971 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:36Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.382672 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f9
2b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:36Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.397789 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:36Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.408134 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:36Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.415988 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.416042 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.416063 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.416092 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.416116 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:36Z","lastTransitionTime":"2025-12-03T12:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.420623 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114241a2-fe6b-43a6-957c-d215ce55737a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7bdd5e9a9885bccbf422740ad714b91e3eb1cc829bddaa977540e1e2be4b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458346453f3d04e596b36d75a5a157cb23f385d76b0afad50ace54255e6b7553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h648m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:36Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.432000 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:36Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.445017 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:36Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.454366 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:36Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.463817 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba
4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:36Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.519152 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.519193 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.519206 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:36 crc 
kubenswrapper[4711]: I1203 12:15:36.519402 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.519421 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:36Z","lastTransitionTime":"2025-12-03T12:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.621543 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.621578 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.621589 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.621609 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.621622 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:36Z","lastTransitionTime":"2025-12-03T12:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.724235 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.724285 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.724296 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.724311 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.724324 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:36Z","lastTransitionTime":"2025-12-03T12:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.816236 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.816236 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:36 crc kubenswrapper[4711]: E1203 12:15:36.816371 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:36 crc kubenswrapper[4711]: E1203 12:15:36.816425 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.816236 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:36 crc kubenswrapper[4711]: E1203 12:15:36.816502 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.826831 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.826857 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.826865 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.826899 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.826939 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:36Z","lastTransitionTime":"2025-12-03T12:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.929293 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.929321 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.929330 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.929345 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:36 crc kubenswrapper[4711]: I1203 12:15:36.929356 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:36Z","lastTransitionTime":"2025-12-03T12:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.032174 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.032214 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.032225 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.032241 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.032253 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:37Z","lastTransitionTime":"2025-12-03T12:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.134164 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.134200 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.134210 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.134226 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.134237 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:37Z","lastTransitionTime":"2025-12-03T12:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.236244 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.236282 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.236294 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.236312 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.236325 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:37Z","lastTransitionTime":"2025-12-03T12:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.338891 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.338943 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.338954 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.338968 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.338978 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:37Z","lastTransitionTime":"2025-12-03T12:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.440993 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.441041 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.441051 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.441069 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.441080 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:37Z","lastTransitionTime":"2025-12-03T12:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.544101 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.544132 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.544141 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.544154 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.544162 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:37Z","lastTransitionTime":"2025-12-03T12:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.647569 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.647625 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.647637 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.647660 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.647673 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:37Z","lastTransitionTime":"2025-12-03T12:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.750015 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.750065 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.750081 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.750106 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.750122 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:37Z","lastTransitionTime":"2025-12-03T12:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.816606 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:37 crc kubenswrapper[4711]: E1203 12:15:37.816768 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.852230 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.852275 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.852286 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.852305 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.852316 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:37Z","lastTransitionTime":"2025-12-03T12:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.955166 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.955207 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.955216 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.955246 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:37 crc kubenswrapper[4711]: I1203 12:15:37.955256 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:37Z","lastTransitionTime":"2025-12-03T12:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.057518 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.057559 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.057571 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.057586 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.057600 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:38Z","lastTransitionTime":"2025-12-03T12:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.159267 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.159318 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.159328 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.159341 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.159351 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:38Z","lastTransitionTime":"2025-12-03T12:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.261947 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.262247 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.262343 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.262469 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.262568 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:38Z","lastTransitionTime":"2025-12-03T12:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.365184 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.365381 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.365491 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.365561 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.365622 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:38Z","lastTransitionTime":"2025-12-03T12:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.468370 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.468437 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.468461 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.468514 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.468535 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:38Z","lastTransitionTime":"2025-12-03T12:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.571692 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.571750 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.571767 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.571791 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.571810 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:38Z","lastTransitionTime":"2025-12-03T12:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.673752 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.674099 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.674280 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.674524 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.674720 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:38Z","lastTransitionTime":"2025-12-03T12:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.776759 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.776817 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.776836 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.776857 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.776872 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:38Z","lastTransitionTime":"2025-12-03T12:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.816526 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:38 crc kubenswrapper[4711]: E1203 12:15:38.816846 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.816618 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:38 crc kubenswrapper[4711]: E1203 12:15:38.817180 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.816578 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:38 crc kubenswrapper[4711]: E1203 12:15:38.817407 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.879409 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.879447 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.879459 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.879475 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.879488 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:38Z","lastTransitionTime":"2025-12-03T12:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.983845 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.984039 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.984068 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.984121 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:38 crc kubenswrapper[4711]: I1203 12:15:38.984142 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:38Z","lastTransitionTime":"2025-12-03T12:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.086748 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.086788 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.086800 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.086816 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.086827 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:39Z","lastTransitionTime":"2025-12-03T12:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.188489 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.188535 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.188545 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.188561 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.188570 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:39Z","lastTransitionTime":"2025-12-03T12:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.291000 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.291036 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.291047 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.291063 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.291075 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:39Z","lastTransitionTime":"2025-12-03T12:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.394733 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.394824 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.394842 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.394866 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.394883 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:39Z","lastTransitionTime":"2025-12-03T12:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.497811 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.497951 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.498032 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.498074 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.498118 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:39Z","lastTransitionTime":"2025-12-03T12:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.601343 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.601394 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.601407 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.601426 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.601439 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:39Z","lastTransitionTime":"2025-12-03T12:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.705293 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.705350 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.705367 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.705389 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.705406 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:39Z","lastTransitionTime":"2025-12-03T12:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.807952 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.807991 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.808001 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.808020 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.808031 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:39Z","lastTransitionTime":"2025-12-03T12:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.816809 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:39 crc kubenswrapper[4711]: E1203 12:15:39.817078 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.910729 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.910769 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.910781 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.910799 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:39 crc kubenswrapper[4711]: I1203 12:15:39.910814 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:39Z","lastTransitionTime":"2025-12-03T12:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.013653 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.013704 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.013718 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.013741 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.013756 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:40Z","lastTransitionTime":"2025-12-03T12:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.116310 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.116370 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.116386 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.116409 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.116426 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:40Z","lastTransitionTime":"2025-12-03T12:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.219288 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.219343 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.219362 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.219384 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.219400 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:40Z","lastTransitionTime":"2025-12-03T12:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.322128 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.322185 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.322202 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.322222 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.322237 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:40Z","lastTransitionTime":"2025-12-03T12:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.425849 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.425971 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.425991 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.426016 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.426038 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:40Z","lastTransitionTime":"2025-12-03T12:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.528886 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.528976 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.528988 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.529009 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.529026 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:40Z","lastTransitionTime":"2025-12-03T12:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.631771 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.631826 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.631854 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.631872 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.631884 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:40Z","lastTransitionTime":"2025-12-03T12:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.734242 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.734271 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.734279 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.734292 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.734301 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:40Z","lastTransitionTime":"2025-12-03T12:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.817027 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:40 crc kubenswrapper[4711]: E1203 12:15:40.817158 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.817324 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.817325 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:40 crc kubenswrapper[4711]: E1203 12:15:40.817381 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:40 crc kubenswrapper[4711]: E1203 12:15:40.817470 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.841612 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.841651 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.841665 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.841690 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.841702 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:40Z","lastTransitionTime":"2025-12-03T12:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.944861 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.944944 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.944960 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.944979 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:40 crc kubenswrapper[4711]: I1203 12:15:40.944992 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:40Z","lastTransitionTime":"2025-12-03T12:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.048133 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.048260 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.048285 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.048343 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.048366 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:41Z","lastTransitionTime":"2025-12-03T12:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.150558 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.150621 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.150636 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.150658 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.150675 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:41Z","lastTransitionTime":"2025-12-03T12:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.253485 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.253551 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.253573 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.253600 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.253622 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:41Z","lastTransitionTime":"2025-12-03T12:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.356174 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.356226 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.356245 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.356266 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.356282 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:41Z","lastTransitionTime":"2025-12-03T12:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.459175 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.459230 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.459247 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.459268 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.459288 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:41Z","lastTransitionTime":"2025-12-03T12:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.562361 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.562407 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.562415 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.562430 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.562441 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:41Z","lastTransitionTime":"2025-12-03T12:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.664771 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.664827 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.664838 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.664856 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.664867 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:41Z","lastTransitionTime":"2025-12-03T12:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.769788 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.769834 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.769846 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.769870 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.769883 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:41Z","lastTransitionTime":"2025-12-03T12:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.816821 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:41 crc kubenswrapper[4711]: E1203 12:15:41.817135 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.831108 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c91e83d-5ec0-4d5d-a152-006c7c6c00c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1492a930ad0b8a0e49c66235300e2d199ff927035f6dfa3bd51aefc59470dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\
\\"containerID\\\":\\\"cri-o://200cc11da19da4cb96844e0e59f1114f5bff8bedda1e49dac120f7a20f30b834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4f7fe5dd73fcc75c47d02eaace24544071dbb6fd692964dc4e03df24e0fdfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19df5db43c6bd42f2676230cceec988e2976753f3b41b3d5381a18910570c4a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19df5db43c6bd42f2676230cceec988e2976753f3b41b3d5381a18910570c4a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.842411 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T12:15:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.855312 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.873223 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.873295 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.873309 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.873329 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.873342 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:41Z","lastTransitionTime":"2025-12-03T12:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.874123 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f678b995eb69a7797bde27947370a777ae334b18c0c79be36dd090741330d5b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f678b995eb69a7797bde27947370a777ae334b18c0c79be36dd090741330d5b3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:15:33Z\\\",\\\"message\\\":\\\"workPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 12:15:33.666262 6363 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.666491 6363 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:15:33.666538 6363 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.666700 6363 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.666829 6363 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.667208 6363 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 12:15:33.668065 6363 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 12:15:33.668094 6363 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 12:15:33.668128 6363 factory.go:656] Stopping watch factory\\\\nI1203 12:15:33.668129 6363 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ct6xt_openshift-ovn-kubernetes(33d2332f-fdac-42be-891e-7eaef0e7ca9d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb6
8d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.886024 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd9tz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdb7f01e-b2fd-49da-b7de-621da238d797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd9tz\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.901197 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.917359 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.930666 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.946081 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ca10d47adbb1e21793d3c069a06e2bcaf9134f0a194fd74c97632317dbbaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964
d75412c30e206ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:
15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:08Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.961629 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70
082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.975969 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.976054 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.976077 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.976112 4711 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.976136 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:41Z","lastTransitionTime":"2025-12-03T12:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.977898 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\
"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.988285 4711 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:41 crc kubenswrapper[4711]: I1203 12:15:41.999274 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114241a2-fe6b-43a6-957c-d215ce55737a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7bdd5e9a9885bccbf422740ad714b91e3eb1cc829bddaa977540e1e2be4b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3
d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458346453f3d04e596b36d75a5a157cb23f385d76b0afad50ace54255e6b7553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:14Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h648m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.013324 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"container
ID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:42Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.026944 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:42Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.038592 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:42Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.051647 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba
4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:42Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.079062 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.079106 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.079117 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:42 crc 
kubenswrapper[4711]: I1203 12:15:42.079157 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.079169 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:42Z","lastTransitionTime":"2025-12-03T12:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.181750 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.181783 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.181792 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.181804 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.181812 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:42Z","lastTransitionTime":"2025-12-03T12:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.283790 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.283821 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.283829 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.283841 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.283850 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:42Z","lastTransitionTime":"2025-12-03T12:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.386014 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.386041 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.386049 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.386060 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.386068 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:42Z","lastTransitionTime":"2025-12-03T12:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.488281 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.488335 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.488350 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.488369 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.488385 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:42Z","lastTransitionTime":"2025-12-03T12:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.590595 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.590667 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.590701 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.590730 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.590751 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:42Z","lastTransitionTime":"2025-12-03T12:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.693917 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.693953 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.693961 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.693974 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.693983 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:42Z","lastTransitionTime":"2025-12-03T12:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.797445 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.797495 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.797507 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.797526 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.797538 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:42Z","lastTransitionTime":"2025-12-03T12:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.816816 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.816842 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.816816 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:42 crc kubenswrapper[4711]: E1203 12:15:42.816962 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:42 crc kubenswrapper[4711]: E1203 12:15:42.817109 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:42 crc kubenswrapper[4711]: E1203 12:15:42.817167 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.899835 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.899888 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.899897 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.899945 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:42 crc kubenswrapper[4711]: I1203 12:15:42.899962 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:42Z","lastTransitionTime":"2025-12-03T12:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.002876 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.002965 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.002985 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.003009 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.003028 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:43Z","lastTransitionTime":"2025-12-03T12:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.105424 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.105478 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.105489 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.105506 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.105517 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:43Z","lastTransitionTime":"2025-12-03T12:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.208031 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.208126 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.208146 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.208173 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.208190 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:43Z","lastTransitionTime":"2025-12-03T12:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.310460 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.310498 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.310506 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.310518 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.310527 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:43Z","lastTransitionTime":"2025-12-03T12:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.412895 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.412954 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.412962 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.412976 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.412985 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:43Z","lastTransitionTime":"2025-12-03T12:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.516114 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.516467 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.516674 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.516840 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.517039 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:43Z","lastTransitionTime":"2025-12-03T12:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.619886 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.619958 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.619970 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.619986 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.619997 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:43Z","lastTransitionTime":"2025-12-03T12:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.722884 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.722941 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.722950 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.722964 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.722972 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:43Z","lastTransitionTime":"2025-12-03T12:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.816584 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:43 crc kubenswrapper[4711]: E1203 12:15:43.817173 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.825138 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.825183 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.825199 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.825223 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.825240 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:43Z","lastTransitionTime":"2025-12-03T12:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.928434 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.928487 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.928511 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.928534 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:43 crc kubenswrapper[4711]: I1203 12:15:43.928549 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:43Z","lastTransitionTime":"2025-12-03T12:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.032122 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.032161 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.032174 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.032189 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.032197 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:44Z","lastTransitionTime":"2025-12-03T12:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.135609 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.135682 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.135696 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.135718 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.135730 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:44Z","lastTransitionTime":"2025-12-03T12:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.238494 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.238570 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.238585 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.238601 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.238613 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:44Z","lastTransitionTime":"2025-12-03T12:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.341217 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.341271 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.341286 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.341307 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.341322 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:44Z","lastTransitionTime":"2025-12-03T12:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.444951 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.445215 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.445357 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.445472 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.445592 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:44Z","lastTransitionTime":"2025-12-03T12:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.549128 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.549441 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.549591 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.549747 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.549886 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:44Z","lastTransitionTime":"2025-12-03T12:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.652307 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.652342 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.652353 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.652369 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.652380 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:44Z","lastTransitionTime":"2025-12-03T12:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.754841 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.755261 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.755459 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.755656 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.755851 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:44Z","lastTransitionTime":"2025-12-03T12:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.817008 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.817070 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.817016 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:44 crc kubenswrapper[4711]: E1203 12:15:44.817219 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:44 crc kubenswrapper[4711]: E1203 12:15:44.817357 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:44 crc kubenswrapper[4711]: E1203 12:15:44.817516 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.859193 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.859396 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.859524 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.859649 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.859795 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:44Z","lastTransitionTime":"2025-12-03T12:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.963079 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.963469 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.963603 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.963739 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:44 crc kubenswrapper[4711]: I1203 12:15:44.963884 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:44Z","lastTransitionTime":"2025-12-03T12:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.067180 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.067241 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.067253 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.067273 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.067285 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:45Z","lastTransitionTime":"2025-12-03T12:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.169816 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.169885 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.169896 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.169937 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.169948 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:45Z","lastTransitionTime":"2025-12-03T12:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.272822 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.272853 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.272860 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.272873 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.272883 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:45Z","lastTransitionTime":"2025-12-03T12:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.374801 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.375119 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.375244 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.375413 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.375555 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:45Z","lastTransitionTime":"2025-12-03T12:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.478393 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.478457 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.478466 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.478482 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.478492 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:45Z","lastTransitionTime":"2025-12-03T12:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.581648 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.581757 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.581779 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.581847 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.581867 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:45Z","lastTransitionTime":"2025-12-03T12:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.684460 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.684528 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.684550 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.684586 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.684618 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:45Z","lastTransitionTime":"2025-12-03T12:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.787275 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.787318 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.787329 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.787346 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.787357 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:45Z","lastTransitionTime":"2025-12-03T12:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.817068 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:45 crc kubenswrapper[4711]: E1203 12:15:45.817221 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.875635 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.875702 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.875714 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.875732 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.875743 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:45Z","lastTransitionTime":"2025-12-03T12:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:45 crc kubenswrapper[4711]: E1203 12:15:45.892693 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:45Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.897551 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.897619 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.897645 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.897680 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.897717 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:45Z","lastTransitionTime":"2025-12-03T12:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:45 crc kubenswrapper[4711]: E1203 12:15:45.915129 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:45Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.919620 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.919717 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.919735 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.919793 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.919814 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:45Z","lastTransitionTime":"2025-12-03T12:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:45 crc kubenswrapper[4711]: E1203 12:15:45.939931 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:45Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.944401 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.944446 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.944457 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.944475 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.944488 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:45Z","lastTransitionTime":"2025-12-03T12:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:45 crc kubenswrapper[4711]: E1203 12:15:45.957416 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:45Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.962583 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.962616 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.962627 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.962642 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.962654 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:45Z","lastTransitionTime":"2025-12-03T12:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:45 crc kubenswrapper[4711]: E1203 12:15:45.980291 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:45Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:45 crc kubenswrapper[4711]: E1203 12:15:45.980433 4711 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.982641 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.982674 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.982685 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.982698 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:45 crc kubenswrapper[4711]: I1203 12:15:45.982709 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:45Z","lastTransitionTime":"2025-12-03T12:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.086273 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.086335 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.086352 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.086379 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.086398 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:46Z","lastTransitionTime":"2025-12-03T12:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.189764 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.189838 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.189859 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.189888 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.189952 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:46Z","lastTransitionTime":"2025-12-03T12:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.293431 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.293497 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.293516 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.293541 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.293558 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:46Z","lastTransitionTime":"2025-12-03T12:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.395977 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.396016 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.396026 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.396042 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.396054 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:46Z","lastTransitionTime":"2025-12-03T12:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.505790 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.507011 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.507050 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.507078 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.507126 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:46Z","lastTransitionTime":"2025-12-03T12:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.610345 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.610409 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.610432 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.610461 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.610482 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:46Z","lastTransitionTime":"2025-12-03T12:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.713390 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.713468 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.713483 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.713506 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.713521 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:46Z","lastTransitionTime":"2025-12-03T12:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.816497 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.816562 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.816597 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:46 crc kubenswrapper[4711]: E1203 12:15:46.816702 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:46 crc kubenswrapper[4711]: E1203 12:15:46.816863 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:46 crc kubenswrapper[4711]: E1203 12:15:46.816968 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.817214 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.817250 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.817262 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.817285 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.817305 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:46Z","lastTransitionTime":"2025-12-03T12:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.919799 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.919898 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.919985 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.920021 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:46 crc kubenswrapper[4711]: I1203 12:15:46.920040 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:46Z","lastTransitionTime":"2025-12-03T12:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.022949 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.023029 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.023052 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.023080 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.023097 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:47Z","lastTransitionTime":"2025-12-03T12:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.126848 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.126897 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.126930 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.126949 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.126962 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:47Z","lastTransitionTime":"2025-12-03T12:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.230152 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.230200 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.230221 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.230241 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.230254 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:47Z","lastTransitionTime":"2025-12-03T12:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.333087 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.333155 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.333193 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.333225 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.333247 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:47Z","lastTransitionTime":"2025-12-03T12:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.436389 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.436466 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.436499 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.436528 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.436551 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:47Z","lastTransitionTime":"2025-12-03T12:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.539853 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.539947 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.539970 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.539997 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.540019 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:47Z","lastTransitionTime":"2025-12-03T12:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.642809 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.642852 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.642863 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.642881 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.642894 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:47Z","lastTransitionTime":"2025-12-03T12:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.745792 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.745836 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.745844 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.745858 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.745868 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:47Z","lastTransitionTime":"2025-12-03T12:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.816735 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:47 crc kubenswrapper[4711]: E1203 12:15:47.816958 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.849084 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.849147 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.849165 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.849188 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.849205 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:47Z","lastTransitionTime":"2025-12-03T12:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.952284 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.952326 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.952337 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.952356 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:47 crc kubenswrapper[4711]: I1203 12:15:47.952368 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:47Z","lastTransitionTime":"2025-12-03T12:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.057347 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.057415 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.057428 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.057448 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.057460 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:48Z","lastTransitionTime":"2025-12-03T12:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.159665 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.159732 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.159743 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.159782 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.159796 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:48Z","lastTransitionTime":"2025-12-03T12:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.262153 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.262212 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.262233 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.262261 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.262283 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:48Z","lastTransitionTime":"2025-12-03T12:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.364750 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.364821 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.364833 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.364849 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.364880 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:48Z","lastTransitionTime":"2025-12-03T12:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.467463 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.468305 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.468368 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.468403 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.468426 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:48Z","lastTransitionTime":"2025-12-03T12:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.551027 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cdb7f01e-b2fd-49da-b7de-621da238d797-metrics-certs\") pod \"network-metrics-daemon-wd9tz\" (UID: \"cdb7f01e-b2fd-49da-b7de-621da238d797\") " pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:48 crc kubenswrapper[4711]: E1203 12:15:48.551217 4711 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 12:15:48 crc kubenswrapper[4711]: E1203 12:15:48.551300 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cdb7f01e-b2fd-49da-b7de-621da238d797-metrics-certs podName:cdb7f01e-b2fd-49da-b7de-621da238d797 nodeName:}" failed. No retries permitted until 2025-12-03 12:16:20.551279426 +0000 UTC m=+99.220530721 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cdb7f01e-b2fd-49da-b7de-621da238d797-metrics-certs") pod "network-metrics-daemon-wd9tz" (UID: "cdb7f01e-b2fd-49da-b7de-621da238d797") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.571375 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.571418 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.571430 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.571448 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.571459 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:48Z","lastTransitionTime":"2025-12-03T12:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.673520 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.673562 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.673571 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.673584 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.673597 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:48Z","lastTransitionTime":"2025-12-03T12:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.775538 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.775585 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.775600 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.775616 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.775628 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:48Z","lastTransitionTime":"2025-12-03T12:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.816467 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.816487 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:48 crc kubenswrapper[4711]: E1203 12:15:48.816617 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.816761 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:48 crc kubenswrapper[4711]: E1203 12:15:48.816881 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:48 crc kubenswrapper[4711]: E1203 12:15:48.817102 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.827314 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.878789 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.878846 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.878865 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.878887 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.878903 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:48Z","lastTransitionTime":"2025-12-03T12:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.981706 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.981964 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.982000 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.982033 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:48 crc kubenswrapper[4711]: I1203 12:15:48.982057 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:48Z","lastTransitionTime":"2025-12-03T12:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.084629 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.084686 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.084700 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.084721 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.084737 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:49Z","lastTransitionTime":"2025-12-03T12:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.187447 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.187482 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.187490 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.187505 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.187515 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:49Z","lastTransitionTime":"2025-12-03T12:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.291491 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.291560 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.291576 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.291598 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.291617 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:49Z","lastTransitionTime":"2025-12-03T12:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.393929 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.393968 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.393978 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.394005 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.394020 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:49Z","lastTransitionTime":"2025-12-03T12:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.495877 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.495944 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.495957 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.495974 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.495985 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:49Z","lastTransitionTime":"2025-12-03T12:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.598259 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.598297 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.598308 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.598322 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.598333 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:49Z","lastTransitionTime":"2025-12-03T12:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.701848 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.701880 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.701889 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.701902 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.701936 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:49Z","lastTransitionTime":"2025-12-03T12:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.804235 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.804304 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.804328 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.804359 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.804383 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:49Z","lastTransitionTime":"2025-12-03T12:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.817271 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:49 crc kubenswrapper[4711]: E1203 12:15:49.817434 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.907419 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.907473 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.907498 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.907532 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:49 crc kubenswrapper[4711]: I1203 12:15:49.907556 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:49Z","lastTransitionTime":"2025-12-03T12:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.010092 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.010138 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.010149 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.010167 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.010180 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:50Z","lastTransitionTime":"2025-12-03T12:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.113025 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.113077 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.113128 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.113146 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.113158 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:50Z","lastTransitionTime":"2025-12-03T12:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.215383 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.215656 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.215729 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.215844 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.215949 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:50Z","lastTransitionTime":"2025-12-03T12:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.317855 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.318013 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.318035 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.318051 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.318064 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:50Z","lastTransitionTime":"2025-12-03T12:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.420644 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.420674 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.420704 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.420719 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.420728 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:50Z","lastTransitionTime":"2025-12-03T12:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.523498 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.523541 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.523553 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.523569 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.523581 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:50Z","lastTransitionTime":"2025-12-03T12:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.625666 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.625730 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.625742 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.625759 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.625772 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:50Z","lastTransitionTime":"2025-12-03T12:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.728122 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.728239 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.728250 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.728275 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.728289 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:50Z","lastTransitionTime":"2025-12-03T12:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.817053 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.817053 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.817062 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:50 crc kubenswrapper[4711]: E1203 12:15:50.817189 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:50 crc kubenswrapper[4711]: E1203 12:15:50.817578 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:50 crc kubenswrapper[4711]: E1203 12:15:50.817671 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.817965 4711 scope.go:117] "RemoveContainer" containerID="f678b995eb69a7797bde27947370a777ae334b18c0c79be36dd090741330d5b3" Dec 03 12:15:50 crc kubenswrapper[4711]: E1203 12:15:50.818123 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-ct6xt_openshift-ovn-kubernetes(33d2332f-fdac-42be-891e-7eaef0e7ca9d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.832075 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.832126 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.832138 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.832157 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.832169 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:50Z","lastTransitionTime":"2025-12-03T12:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.934542 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.934574 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.934585 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.934599 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:50 crc kubenswrapper[4711]: I1203 12:15:50.934609 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:50Z","lastTransitionTime":"2025-12-03T12:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.037495 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.037557 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.037621 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.037645 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.037662 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:51Z","lastTransitionTime":"2025-12-03T12:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.139736 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.139803 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.139821 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.139847 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.139863 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:51Z","lastTransitionTime":"2025-12-03T12:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.201402 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gwhcr_216c3ac8-462c-49ec-87a2-c935d0c4ad25/kube-multus/0.log" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.201449 4711 generic.go:334] "Generic (PLEG): container finished" podID="216c3ac8-462c-49ec-87a2-c935d0c4ad25" containerID="4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45" exitCode=1 Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.201481 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gwhcr" event={"ID":"216c3ac8-462c-49ec-87a2-c935d0c4ad25","Type":"ContainerDied","Data":"4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45"} Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.201882 4711 scope.go:117] "RemoveContainer" containerID="4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.219785 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.232679 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:15:50Z\\\",\\\"message\\\":\\\"2025-12-03T12:15:04+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0aff7da2-5e80-4447-b3fa-e429af0f7d7f\\\\n2025-12-03T12:15:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0aff7da2-5e80-4447-b3fa-e429af0f7d7f to /host/opt/cni/bin/\\\\n2025-12-03T12:15:05Z [verbose] multus-daemon started\\\\n2025-12-03T12:15:05Z [verbose] Readiness Indicator file check\\\\n2025-12-03T12:15:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.242008 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.242051 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.242061 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.242079 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.242091 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:51Z","lastTransitionTime":"2025-12-03T12:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.249446 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ca10d47adbb1e21793d3c069a06e2bcaf9134f0a194fd74c97632317dbbaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\
\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4
a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.261219 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1df44dcc-77dc-4f9d-add9-f94f6e32a7e1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9805346e5e71e3954a1c2e83f16f8ec65048b2d76f9def6362e4a65191e15678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a722d33ed10d5693ae916d46d2bac91ce7d7709f0933ce882ea8e1292700de2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a722d33ed10d5693ae916d46d2bac91ce7d7709f0933ce882ea8e1292700de2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T12:15:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.273216 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.284790 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114241a2-fe6b-43a6-957c-d215ce55737a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7bdd5e9a9885bccbf422740ad714b91e3eb1cc829bddaa977540e1e2be4b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458346453f3d04e596b36d75a5a157cb23f38
5d76b0afad50ace54255e6b7553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h648m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.301005 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f9
2b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.318877 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.327267 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.337408 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.344132 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.344361 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.344441 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.344505 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.344561 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:51Z","lastTransitionTime":"2025-12-03T12:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.349107 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e483
1acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.360075 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.371799 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.388696 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f678b995eb69a7797bde27947370a777ae334b18c0c79be36dd090741330d5b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f678b995eb69a7797bde27947370a777ae334b18c0c79be36dd090741330d5b3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:15:33Z\\\",\\\"message\\\":\\\"workPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 12:15:33.666262 6363 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.666491 6363 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:15:33.666538 6363 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.666700 6363 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.666829 6363 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.667208 6363 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 12:15:33.668065 6363 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 12:15:33.668094 6363 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 12:15:33.668128 6363 factory.go:656] Stopping watch factory\\\\nI1203 12:15:33.668129 6363 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ct6xt_openshift-ovn-kubernetes(33d2332f-fdac-42be-891e-7eaef0e7ca9d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb6
8d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.398227 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd9tz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdb7f01e-b2fd-49da-b7de-621da238d797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd9tz\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.408583 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c91e83d-5ec0-4d5d-a152-006c7c6c00c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1492a930ad0b8a0e49c66235300e2d199ff927035f6dfa3bd51aefc59470dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200cc11da19da4cb96844e0e59f1114f5bff8bedda1e49dac120f7a20f30b834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4f7fe5dd73fcc75c47d02eaace24544071dbb6fd692964dc4e03df24e0fdfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19df5db4
3c6bd42f2676230cceec988e2976753f3b41b3d5381a18910570c4a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19df5db43c6bd42f2676230cceec988e2976753f3b41b3d5381a18910570c4a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.419179 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T12:15:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.428254 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.446967 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.447013 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.447021 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.447034 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.447043 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:51Z","lastTransitionTime":"2025-12-03T12:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.549818 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.549954 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.549966 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.549988 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.550000 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:51Z","lastTransitionTime":"2025-12-03T12:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.652136 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.652177 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.652187 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.652199 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.652209 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:51Z","lastTransitionTime":"2025-12-03T12:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.754200 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.754241 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.754252 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.754268 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.754279 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:51Z","lastTransitionTime":"2025-12-03T12:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.817028 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:51 crc kubenswrapper[4711]: E1203 12:15:51.817215 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.839950 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f678b995eb69a7797bde27947370a777ae334b18c0c79be36dd090741330d5b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f678b995eb69a7797bde27947370a777ae334b18c0c79be36dd090741330d5b3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:15:33Z\\\",\\\"message\\\":\\\"workPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 12:15:33.666262 6363 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.666491 6363 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:15:33.666538 6363 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.666700 6363 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.666829 6363 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.667208 6363 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 12:15:33.668065 6363 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 12:15:33.668094 6363 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 12:15:33.668128 6363 factory.go:656] Stopping watch factory\\\\nI1203 12:15:33.668129 6363 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ct6xt_openshift-ovn-kubernetes(33d2332f-fdac-42be-891e-7eaef0e7ca9d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb6
8d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.851870 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd9tz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdb7f01e-b2fd-49da-b7de-621da238d797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd9tz\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.856342 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.856381 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.856392 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.856410 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.856422 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:51Z","lastTransitionTime":"2025-12-03T12:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.864006 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c91e83d-5ec0-4d5d-a152-006c7c6c00c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1492a930ad0b8a0e49c66235300e2d199ff927035f6dfa3bd51aefc59470dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200cc11da19da4cb96844e0e59f111
4f5bff8bedda1e49dac120f7a20f30b834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4f7fe5dd73fcc75c47d02eaace24544071dbb6fd692964dc4e03df24e0fdfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19df5db43c6bd42f2676230cceec988e2976753f3b41b3d5381a18910570c4a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19df5db43c6bd42f2676230cceec988e2976753f3b41b3d5381a18910570c4a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.876100 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T12:15:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.885368 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.897934 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.911544 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:15:50Z\\\",\\\"message\\\":\\\"2025-12-03T12:15:04+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0aff7da2-5e80-4447-b3fa-e429af0f7d7f\\\\n2025-12-03T12:15:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0aff7da2-5e80-4447-b3fa-e429af0f7d7f to /host/opt/cni/bin/\\\\n2025-12-03T12:15:05Z [verbose] multus-daemon started\\\\n2025-12-03T12:15:05Z [verbose] Readiness Indicator file check\\\\n2025-12-03T12:15:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.924116 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ca10d47adbb1e21793d3c069a06e2bcaf9134f0a194fd74c97632317dbbaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6c4
b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.939041 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1df44dcc-77dc-4f9d-add9-f94f6e32a7e1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9805346e5e71e3954a1c2e83f16f8ec65048b2d76f9def6362e4a65191e15678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a722d33ed10d5693ae916d46d2bac91ce7d7709f0933ce882ea8e1292700de2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a722d33ed10d5693ae916d46d2bac91ce7d7709f0933ce882ea8e1292700de2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.954138 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.959056 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.959176 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.959265 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.959335 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.959409 4711 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:51Z","lastTransitionTime":"2025-12-03T12:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.968689 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114241a2-fe6b-43a6-957c-d215ce55737a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7bdd5e9a9885bccbf422740ad714b91e3eb1cc829bddaa977540e1e2be4b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458346453f3d04e596b36d75a5a157cb23f385d76b0afad50ace54255e6b7553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h648m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.984873 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f894
5c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:51 crc kubenswrapper[4711]: I1203 12:15:51.996498 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.010978 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.027250 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.042401 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",
\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha2
56:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.055700 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.061526 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.061566 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.061576 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.061593 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.061604 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:52Z","lastTransitionTime":"2025-12-03T12:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.067842 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.163418 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.163466 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.163479 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.163496 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.163508 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:52Z","lastTransitionTime":"2025-12-03T12:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.206291 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gwhcr_216c3ac8-462c-49ec-87a2-c935d0c4ad25/kube-multus/0.log" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.206531 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gwhcr" event={"ID":"216c3ac8-462c-49ec-87a2-c935d0c4ad25","Type":"ContainerStarted","Data":"0ad2a184bbabe1f39e87385729c4f5006623e99a5008fff374ff8754bba2f093"} Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.220262 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c91e83d-5ec0-4d5d-a152-006c7c6c00c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1492a930ad0b8a0e49c66235300e2d199ff927035f6dfa3bd51aefc59470dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"ima
geID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200cc11da19da4cb96844e0e59f1114f5bff8bedda1e49dac120f7a20f30b834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4f7fe5dd73fcc75c47d02eaace24544071dbb6fd692964dc4e03df24e0fdfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19df5db43c6bd42f2676230cceec988e2976753f3b41b3d5381a18910570c4a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19df5db43c6bd42f2676230cceec988e2976753f3b41b3d5381a18910570c4a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.233885 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T12:15:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.244545 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.265440 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.265494 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.265504 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.265523 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.265534 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:52Z","lastTransitionTime":"2025-12-03T12:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.266781 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f678b995eb69a7797bde27947370a777ae334b18c0c79be36dd090741330d5b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f678b995eb69a7797bde27947370a777ae334b18c0c79be36dd090741330d5b3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:15:33Z\\\",\\\"message\\\":\\\"workPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 12:15:33.666262 6363 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.666491 6363 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:15:33.666538 6363 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.666700 6363 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.666829 6363 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.667208 6363 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 12:15:33.668065 6363 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 12:15:33.668094 6363 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 12:15:33.668128 6363 factory.go:656] Stopping watch factory\\\\nI1203 12:15:33.668129 6363 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ct6xt_openshift-ovn-kubernetes(33d2332f-fdac-42be-891e-7eaef0e7ca9d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb6
8d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.280381 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd9tz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdb7f01e-b2fd-49da-b7de-621da238d797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd9tz\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.296897 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ca10d47adbb1e21793d3c069a06e2bcaf9134f0a194fd74c97632317dbbaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a24
73a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/c
ni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.308780 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1df44dcc-77dc-4f9d-add9-f94f6e32a7e1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9805346e5e71e3954a1c2e83f16f8ec65048b2d76f9def6362e4a65191e15678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a722d33ed10d5693ae916d46d2bac91ce7d7709f0933ce882ea8e1292700de2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a722d33ed10d5693ae916d46d2bac91ce7d7709f0933ce882ea8e1292700de2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.323638 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.339929 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.357547 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad2a184bbabe1f39e87385729c4f5006623e99a5008fff374ff8754bba2f093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:15:50Z\\\",\\\"message\\\":\\\"2025-12-03T12:15:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0aff7da2-5e80-4447-b3fa-e429af0f7d7f\\\\n2025-12-03T12:15:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0aff7da2-5e80-4447-b3fa-e429af0f7d7f to /host/opt/cni/bin/\\\\n2025-12-03T12:15:05Z [verbose] multus-daemon started\\\\n2025-12-03T12:15:05Z [verbose] 
Readiness Indicator file check\\\\n2025-12-03T12:15:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.368580 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.368645 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.368659 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.368675 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.368685 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:52Z","lastTransitionTime":"2025-12-03T12:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.374654 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.394498 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.408036 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.421064 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114241a2-fe6b-43a6-957c-d215ce55737a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7bdd5e9a9885bccbf422740ad714b91e3eb1cc829bddaa977540e1e2be4b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458346453f3d04e596b36d75a5a157cb23f385d76b0afad50ace54255e6b7553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h648m\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.434590 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:1
4:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9
e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.448977 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.462391 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.471316 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.471376 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.471387 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.471410 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.471422 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:52Z","lastTransitionTime":"2025-12-03T12:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.474699 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.573501 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:52 crc 
kubenswrapper[4711]: I1203 12:15:52.573564 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.573583 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.573607 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.573623 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:52Z","lastTransitionTime":"2025-12-03T12:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.676199 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.676235 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.676243 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.676257 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.676266 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:52Z","lastTransitionTime":"2025-12-03T12:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.779049 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.779081 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.779090 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.779104 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.779114 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:52Z","lastTransitionTime":"2025-12-03T12:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.816822 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:52 crc kubenswrapper[4711]: E1203 12:15:52.816938 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.817085 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:52 crc kubenswrapper[4711]: E1203 12:15:52.817157 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.817249 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:52 crc kubenswrapper[4711]: E1203 12:15:52.817429 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.881814 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.881863 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.881879 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.881895 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.881924 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:52Z","lastTransitionTime":"2025-12-03T12:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.984196 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.984232 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.984242 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.984259 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:52 crc kubenswrapper[4711]: I1203 12:15:52.984271 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:52Z","lastTransitionTime":"2025-12-03T12:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.086350 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.086435 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.086464 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.086495 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.086516 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:53Z","lastTransitionTime":"2025-12-03T12:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.189598 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.189660 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.189682 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.189708 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.189728 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:53Z","lastTransitionTime":"2025-12-03T12:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.292254 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.292312 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.292321 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.292336 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.292349 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:53Z","lastTransitionTime":"2025-12-03T12:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.395652 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.395722 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.395739 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.395765 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.395784 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:53Z","lastTransitionTime":"2025-12-03T12:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.499217 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.499279 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.499297 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.499320 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.499336 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:53Z","lastTransitionTime":"2025-12-03T12:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.603066 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.603161 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.603186 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.603220 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.603238 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:53Z","lastTransitionTime":"2025-12-03T12:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.710389 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.710438 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.710454 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.710476 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.710491 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:53Z","lastTransitionTime":"2025-12-03T12:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.813064 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.813806 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.813830 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.813854 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.813871 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:53Z","lastTransitionTime":"2025-12-03T12:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.817041 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:53 crc kubenswrapper[4711]: E1203 12:15:53.817231 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.916227 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.916274 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.916283 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.916294 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:53 crc kubenswrapper[4711]: I1203 12:15:53.916304 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:53Z","lastTransitionTime":"2025-12-03T12:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.019356 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.019411 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.019423 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.019443 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.019455 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:54Z","lastTransitionTime":"2025-12-03T12:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.122728 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.122794 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.122816 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.122847 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.122865 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:54Z","lastTransitionTime":"2025-12-03T12:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.224691 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.224735 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.224759 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.224773 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.224783 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:54Z","lastTransitionTime":"2025-12-03T12:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.328410 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.328479 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.328494 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.328519 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.328535 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:54Z","lastTransitionTime":"2025-12-03T12:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.431344 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.431433 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.431445 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.431465 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.431495 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:54Z","lastTransitionTime":"2025-12-03T12:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.534952 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.535012 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.535031 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.535055 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.535073 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:54Z","lastTransitionTime":"2025-12-03T12:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.637405 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.637466 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.637479 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.637495 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.637508 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:54Z","lastTransitionTime":"2025-12-03T12:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.740120 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.740173 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.740184 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.740258 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.740307 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:54Z","lastTransitionTime":"2025-12-03T12:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.817088 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.817148 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:54 crc kubenswrapper[4711]: E1203 12:15:54.817245 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:54 crc kubenswrapper[4711]: E1203 12:15:54.817345 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.817440 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:54 crc kubenswrapper[4711]: E1203 12:15:54.817551 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.843060 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.843134 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.843142 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.843156 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.843166 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:54Z","lastTransitionTime":"2025-12-03T12:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.946196 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.946246 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.946256 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.946269 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:54 crc kubenswrapper[4711]: I1203 12:15:54.946277 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:54Z","lastTransitionTime":"2025-12-03T12:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.049246 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.049401 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.049433 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.049466 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.049488 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:55Z","lastTransitionTime":"2025-12-03T12:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.152004 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.152039 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.152050 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.152068 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.152080 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:55Z","lastTransitionTime":"2025-12-03T12:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.255008 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.255082 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.255105 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.255133 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.255153 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:55Z","lastTransitionTime":"2025-12-03T12:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.358215 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.358282 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.358305 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.358335 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.358359 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:55Z","lastTransitionTime":"2025-12-03T12:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.461044 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.461110 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.461127 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.461153 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.461172 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:55Z","lastTransitionTime":"2025-12-03T12:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.563402 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.563458 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.563473 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.563498 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.563515 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:55Z","lastTransitionTime":"2025-12-03T12:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.666540 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.666634 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.666647 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.666695 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.666710 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:55Z","lastTransitionTime":"2025-12-03T12:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.768507 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.768546 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.768555 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.768570 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.768579 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:55Z","lastTransitionTime":"2025-12-03T12:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.817362 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:55 crc kubenswrapper[4711]: E1203 12:15:55.817514 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.870771 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.870835 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.870845 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.870876 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.870888 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:55Z","lastTransitionTime":"2025-12-03T12:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.973496 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.973563 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.973584 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.973608 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:55 crc kubenswrapper[4711]: I1203 12:15:55.973625 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:55Z","lastTransitionTime":"2025-12-03T12:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.076732 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.076810 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.076828 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.076855 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.076878 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:56Z","lastTransitionTime":"2025-12-03T12:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.134830 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.134865 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.134874 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.134887 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.134895 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:56Z","lastTransitionTime":"2025-12-03T12:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:56 crc kubenswrapper[4711]: E1203 12:15:56.153436 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.165957 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.166008 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.166019 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.166039 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.166054 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:56Z","lastTransitionTime":"2025-12-03T12:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:56 crc kubenswrapper[4711]: E1203 12:15:56.186163 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.191792 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.191865 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.191884 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.191935 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.191955 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:56Z","lastTransitionTime":"2025-12-03T12:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:56 crc kubenswrapper[4711]: E1203 12:15:56.212139 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.217163 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.217208 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.217219 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.217236 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.217246 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:56Z","lastTransitionTime":"2025-12-03T12:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:56 crc kubenswrapper[4711]: E1203 12:15:56.238061 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.242568 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.242601 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.242614 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.242631 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.242643 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:56Z","lastTransitionTime":"2025-12-03T12:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:56 crc kubenswrapper[4711]: E1203 12:15:56.257283 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:56 crc kubenswrapper[4711]: E1203 12:15:56.257450 4711 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.259320 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.259393 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.259417 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.259448 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.259470 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:56Z","lastTransitionTime":"2025-12-03T12:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.362672 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.362724 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.362740 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.362762 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.362779 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:56Z","lastTransitionTime":"2025-12-03T12:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.466116 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.466146 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.466155 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.466168 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.466177 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:56Z","lastTransitionTime":"2025-12-03T12:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.569133 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.569189 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.569209 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.569239 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.569259 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:56Z","lastTransitionTime":"2025-12-03T12:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.671957 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.672027 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.672052 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.672082 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.672104 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:56Z","lastTransitionTime":"2025-12-03T12:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.775354 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.775398 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.775410 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.775428 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.775442 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:56Z","lastTransitionTime":"2025-12-03T12:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.816964 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:56 crc kubenswrapper[4711]: E1203 12:15:56.817090 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.816975 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.817168 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:56 crc kubenswrapper[4711]: E1203 12:15:56.817348 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:56 crc kubenswrapper[4711]: E1203 12:15:56.817443 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.877724 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.877839 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.877851 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.877870 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.877883 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:56Z","lastTransitionTime":"2025-12-03T12:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.980764 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.980846 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.980868 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.980894 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:56 crc kubenswrapper[4711]: I1203 12:15:56.980944 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:56Z","lastTransitionTime":"2025-12-03T12:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.083193 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.083226 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.083234 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.083248 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.083258 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:57Z","lastTransitionTime":"2025-12-03T12:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.185520 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.185551 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.185561 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.185576 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.185587 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:57Z","lastTransitionTime":"2025-12-03T12:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.289594 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.289666 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.289685 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.289707 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.289723 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:57Z","lastTransitionTime":"2025-12-03T12:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.399282 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.399354 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.399377 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.399406 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.399430 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:57Z","lastTransitionTime":"2025-12-03T12:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.502715 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.502754 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.502764 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.502779 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.502789 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:57Z","lastTransitionTime":"2025-12-03T12:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.606176 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.606567 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.606745 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.606966 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.607163 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:57Z","lastTransitionTime":"2025-12-03T12:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.709398 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.709456 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.709472 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.709496 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.709512 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:57Z","lastTransitionTime":"2025-12-03T12:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.812814 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.812870 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.812890 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.812953 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.812972 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:57Z","lastTransitionTime":"2025-12-03T12:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.827542 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:57 crc kubenswrapper[4711]: E1203 12:15:57.827745 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.916139 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.916174 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.916182 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.916195 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:57 crc kubenswrapper[4711]: I1203 12:15:57.916204 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:57Z","lastTransitionTime":"2025-12-03T12:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.019053 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.019138 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.019165 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.019199 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.019224 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:58Z","lastTransitionTime":"2025-12-03T12:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.121858 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.121955 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.121982 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.122002 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.122016 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:58Z","lastTransitionTime":"2025-12-03T12:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.224424 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.224472 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.224490 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.224523 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.224560 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:58Z","lastTransitionTime":"2025-12-03T12:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.327065 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.327118 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.327127 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.327139 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.327148 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:58Z","lastTransitionTime":"2025-12-03T12:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.430326 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.430358 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.430365 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.430378 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.430388 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:58Z","lastTransitionTime":"2025-12-03T12:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.533529 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.533636 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.533656 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.533719 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.533738 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:58Z","lastTransitionTime":"2025-12-03T12:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.636305 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.636365 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.636376 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.636394 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.636406 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:58Z","lastTransitionTime":"2025-12-03T12:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.739853 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.739937 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.739957 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.739978 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.739993 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:58Z","lastTransitionTime":"2025-12-03T12:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.817124 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.817150 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.817132 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:58 crc kubenswrapper[4711]: E1203 12:15:58.817260 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:58 crc kubenswrapper[4711]: E1203 12:15:58.817387 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:58 crc kubenswrapper[4711]: E1203 12:15:58.817512 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.842086 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.842144 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.842158 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.842176 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.842191 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:58Z","lastTransitionTime":"2025-12-03T12:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.945125 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.945201 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.945221 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.945245 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:58 crc kubenswrapper[4711]: I1203 12:15:58.945262 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:58Z","lastTransitionTime":"2025-12-03T12:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.047674 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.047716 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.047733 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.047755 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.047770 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:59Z","lastTransitionTime":"2025-12-03T12:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.150047 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.150114 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.150134 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.150186 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.150226 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:59Z","lastTransitionTime":"2025-12-03T12:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.252973 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.253976 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.254196 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.254434 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.254658 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:59Z","lastTransitionTime":"2025-12-03T12:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.357340 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.357390 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.357405 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.357424 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.357434 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:59Z","lastTransitionTime":"2025-12-03T12:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.460546 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.460589 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.460600 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.460617 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.460628 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:59Z","lastTransitionTime":"2025-12-03T12:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.563310 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.563374 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.563386 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.563403 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.563420 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:59Z","lastTransitionTime":"2025-12-03T12:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.666126 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.666207 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.666225 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.666244 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.666260 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:59Z","lastTransitionTime":"2025-12-03T12:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.769125 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.769153 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.769161 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.769174 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.769193 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:59Z","lastTransitionTime":"2025-12-03T12:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.817149 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:15:59 crc kubenswrapper[4711]: E1203 12:15:59.817298 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.871863 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.871965 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.871990 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.872018 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.872039 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:59Z","lastTransitionTime":"2025-12-03T12:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.976016 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.976069 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.976089 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.976114 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:59 crc kubenswrapper[4711]: I1203 12:15:59.976134 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:59Z","lastTransitionTime":"2025-12-03T12:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.078458 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.078485 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.078493 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.078510 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.078520 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:00Z","lastTransitionTime":"2025-12-03T12:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.181245 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.181279 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.181289 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.181304 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.181315 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:00Z","lastTransitionTime":"2025-12-03T12:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.284530 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.284599 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.284623 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.284656 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.284678 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:00Z","lastTransitionTime":"2025-12-03T12:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.386684 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.386736 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.386745 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.386760 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.386775 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:00Z","lastTransitionTime":"2025-12-03T12:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.493870 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.494549 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.494965 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.495006 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.495024 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:00Z","lastTransitionTime":"2025-12-03T12:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.597761 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.597819 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.597833 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.597851 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.597863 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:00Z","lastTransitionTime":"2025-12-03T12:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.701553 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.701822 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.701889 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.701984 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.702047 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:00Z","lastTransitionTime":"2025-12-03T12:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.806981 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.807353 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.807376 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.807402 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.807420 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:00Z","lastTransitionTime":"2025-12-03T12:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.816484 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.816528 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.816503 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:16:00 crc kubenswrapper[4711]: E1203 12:16:00.816653 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:16:00 crc kubenswrapper[4711]: E1203 12:16:00.816740 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:16:00 crc kubenswrapper[4711]: E1203 12:16:00.816825 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.910828 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.911183 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.911272 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.911354 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:00 crc kubenswrapper[4711]: I1203 12:16:00.911435 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:00Z","lastTransitionTime":"2025-12-03T12:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.014558 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.014894 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.015040 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.015133 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.015218 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:01Z","lastTransitionTime":"2025-12-03T12:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.117269 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.117311 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.117323 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.117339 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.117352 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:01Z","lastTransitionTime":"2025-12-03T12:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.220693 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.221133 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.221233 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.221326 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.221415 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:01Z","lastTransitionTime":"2025-12-03T12:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.325204 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.325881 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.326048 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.326176 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.326303 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:01Z","lastTransitionTime":"2025-12-03T12:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.429901 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.430044 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.430068 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.430093 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.430111 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:01Z","lastTransitionTime":"2025-12-03T12:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.533166 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.533230 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.533248 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.533271 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.533288 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:01Z","lastTransitionTime":"2025-12-03T12:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.635528 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.635571 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.635581 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.635599 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.635613 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:01Z","lastTransitionTime":"2025-12-03T12:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.738345 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.738380 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.738388 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.738404 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.738412 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:01Z","lastTransitionTime":"2025-12-03T12:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.816852 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:16:01 crc kubenswrapper[4711]: E1203 12:16:01.817083 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.837397 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 
12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.842196 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.842274 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.842300 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.842333 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.842356 4711 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:01Z","lastTransitionTime":"2025-12-03T12:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.854549 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.868154 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.884033 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114241a2-fe6b-43a6-957c-d215ce55737a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7bdd5e9a9885bccbf422740ad714b91e3eb1cc829bddaa977540e1e2be4b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458346453f3d04e596b36d75a5a157cb23f385d76b0afad50ace54255e6b7553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h648m\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.897760 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:1
4:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9
e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.918726 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.933758 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.945260 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.945306 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.945321 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.945341 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.945356 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:01Z","lastTransitionTime":"2025-12-03T12:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.947588 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.961124 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd9tz" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdb7f01e-b2fd-49da-b7de-621da238d797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd9tz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:01 crc 
kubenswrapper[4711]: I1203 12:16:01.975990 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c91e83d-5ec0-4d5d-a152-006c7c6c00c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1492a930ad0b8a0e49c66235300e2d199ff927035f6dfa3bd51aefc59470dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200cc11da19da4cb96844e0e59f1114f5bff8bedda1e49dac120f7a20f30b834\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4f7fe5dd73fcc75c47d02eaace24544071dbb6fd692964dc4e03df24e0fdfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19df5db43c6bd42f2676230cceec988e2976753f3b41b3d5381a18910570c4a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19df5db43c6bd42f2676230cceec988e2976753f3b41b3d5381a18910570c4a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:01 crc kubenswrapper[4711]: I1203 12:16:01.990085 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T12:16:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.001374 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.028644 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f678b995eb69a7797bde27947370a777ae334b18c0c79be36dd090741330d5b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f678b995eb69a7797bde27947370a777ae334b18c0c79be36dd090741330d5b3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:15:33Z\\\",\\\"message\\\":\\\"workPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 12:15:33.666262 6363 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.666491 6363 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:15:33.666538 6363 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.666700 6363 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.666829 6363 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.667208 6363 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 12:15:33.668065 6363 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 12:15:33.668094 6363 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 12:15:33.668128 6363 factory.go:656] Stopping watch factory\\\\nI1203 12:15:33.668129 6363 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ct6xt_openshift-ovn-kubernetes(33d2332f-fdac-42be-891e-7eaef0e7ca9d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb6
8d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:02Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.042178 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad2a184bbabe1f39e87385729c4f5006623e99a5008fff374ff8754bba2f093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:15:50Z\\\",\\\"message\\\":\\\"2025-12-03T12:15:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0aff7da2-5e80-4447-b3fa-e429af0f7d7f\\\\n2025-12-03T12:15:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0aff7da2-5e80-4447-b3fa-e429af0f7d7f to /host/opt/cni/bin/\\\\n2025-12-03T12:15:05Z [verbose] multus-daemon started\\\\n2025-12-03T12:15:05Z [verbose] 
Readiness Indicator file check\\\\n2025-12-03T12:15:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:02Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.047644 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.047698 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.047715 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.047732 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.047744 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:02Z","lastTransitionTime":"2025-12-03T12:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.058069 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ca10d47adbb1e21793d3c069a06e2bcaf9134f0a194fd74c97632317dbbaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:02Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.068925 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1df44dcc-77dc-4f9d-add9-f94f6e32a7e1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9805346e5e71e3954a1c2e83f16f8ec65048b2d76f9def6362e4a65191e15678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e
18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a722d33ed10d5693ae916d46d2bac91ce7d7709f0933ce882ea8e1292700de2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a722d33ed10d5693ae916d46d2bac91ce7d7709f0933ce882ea8e1292700de2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:02Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:02 crc 
kubenswrapper[4711]: I1203 12:16:02.082529 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:02Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.094248 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:02Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.151136 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.151198 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:02 crc 
kubenswrapper[4711]: I1203 12:16:02.151215 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.151241 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.151259 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:02Z","lastTransitionTime":"2025-12-03T12:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.253401 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.253456 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.253476 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.253507 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.253530 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:02Z","lastTransitionTime":"2025-12-03T12:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.356844 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.356900 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.356937 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.356957 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.356973 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:02Z","lastTransitionTime":"2025-12-03T12:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.460411 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.460514 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.460530 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.460548 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.460559 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:02Z","lastTransitionTime":"2025-12-03T12:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.563992 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.564079 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.564092 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.564115 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.564131 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:02Z","lastTransitionTime":"2025-12-03T12:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.666857 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.666956 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.666970 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.667005 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.667039 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:02Z","lastTransitionTime":"2025-12-03T12:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.770240 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.770310 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.770325 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.770342 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.770353 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:02Z","lastTransitionTime":"2025-12-03T12:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.817163 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.817186 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.817328 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:16:02 crc kubenswrapper[4711]: E1203 12:16:02.817322 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:16:02 crc kubenswrapper[4711]: E1203 12:16:02.817435 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:16:02 crc kubenswrapper[4711]: E1203 12:16:02.817584 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.873190 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.873252 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.873266 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.873287 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.873302 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:02Z","lastTransitionTime":"2025-12-03T12:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.976037 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.976104 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.976122 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.976145 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:02 crc kubenswrapper[4711]: I1203 12:16:02.976163 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:02Z","lastTransitionTime":"2025-12-03T12:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.078210 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.078257 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.078269 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.078288 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.078302 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:03Z","lastTransitionTime":"2025-12-03T12:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.181359 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.181434 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.181456 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.181486 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.181508 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:03Z","lastTransitionTime":"2025-12-03T12:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.284098 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.284151 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.284175 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.284201 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.284218 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:03Z","lastTransitionTime":"2025-12-03T12:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.387366 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.387412 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.387423 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.387443 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.387454 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:03Z","lastTransitionTime":"2025-12-03T12:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.489886 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.489980 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.490008 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.490037 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.490060 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:03Z","lastTransitionTime":"2025-12-03T12:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.593361 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.593415 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.593429 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.593455 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.593471 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:03Z","lastTransitionTime":"2025-12-03T12:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.695961 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.696016 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.696032 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.696055 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.696075 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:03Z","lastTransitionTime":"2025-12-03T12:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.799574 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.799644 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.799673 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.799704 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.799725 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:03Z","lastTransitionTime":"2025-12-03T12:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.817434 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:16:03 crc kubenswrapper[4711]: E1203 12:16:03.817685 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.903099 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.903263 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.903291 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.903321 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:03 crc kubenswrapper[4711]: I1203 12:16:03.903341 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:03Z","lastTransitionTime":"2025-12-03T12:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.006842 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.006895 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.006945 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.007004 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.007023 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:04Z","lastTransitionTime":"2025-12-03T12:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.109223 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.109272 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.109280 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.109293 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.109302 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:04Z","lastTransitionTime":"2025-12-03T12:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.212213 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.212254 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.212264 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.212279 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.212288 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:04Z","lastTransitionTime":"2025-12-03T12:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.315318 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.315355 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.315390 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.315403 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.315411 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:04Z","lastTransitionTime":"2025-12-03T12:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.418300 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.418351 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.418386 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.418402 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.418410 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:04Z","lastTransitionTime":"2025-12-03T12:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.520991 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.521038 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.521049 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.521065 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.521075 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:04Z","lastTransitionTime":"2025-12-03T12:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.620874 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:04 crc kubenswrapper[4711]: E1203 12:16:04.621096 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 12:17:08.621067703 +0000 UTC m=+147.290318958 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.623356 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.623378 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.623386 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.623399 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.623409 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:04Z","lastTransitionTime":"2025-12-03T12:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.721688 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.721805 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.721857 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:16:04 crc kubenswrapper[4711]: E1203 12:16:04.721983 4711 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 12:16:04 crc kubenswrapper[4711]: E1203 12:16:04.722048 4711 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 12:16:04 crc kubenswrapper[4711]: E1203 12:16:04.722088 4711 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 12:17:08.722057585 +0000 UTC m=+147.391308890 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 12:16:04 crc kubenswrapper[4711]: E1203 12:16:04.722116 4711 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.721986 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:16:04 crc kubenswrapper[4711]: E1203 12:16:04.722152 4711 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 12:16:04 crc kubenswrapper[4711]: E1203 12:16:04.722175 4711 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:16:04 crc kubenswrapper[4711]: E1203 
12:16:04.722050 4711 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 12:16:04 crc kubenswrapper[4711]: E1203 12:16:04.722256 4711 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 12:16:04 crc kubenswrapper[4711]: E1203 12:16:04.722275 4711 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:16:04 crc kubenswrapper[4711]: E1203 12:16:04.722124 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 12:17:08.722105856 +0000 UTC m=+147.391357161 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 12:16:04 crc kubenswrapper[4711]: E1203 12:16:04.722347 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 12:17:08.722324992 +0000 UTC m=+147.391576287 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:16:04 crc kubenswrapper[4711]: E1203 12:16:04.722375 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 12:17:08.722360123 +0000 UTC m=+147.391611418 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.726543 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.726601 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.726623 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.726686 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.726782 4711 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:04Z","lastTransitionTime":"2025-12-03T12:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.816313 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.816406 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.816336 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:16:04 crc kubenswrapper[4711]: E1203 12:16:04.816502 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:16:04 crc kubenswrapper[4711]: E1203 12:16:04.816636 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:16:04 crc kubenswrapper[4711]: E1203 12:16:04.816736 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.829738 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.829781 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.829790 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.829811 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.829822 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:04Z","lastTransitionTime":"2025-12-03T12:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.932042 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.932087 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.932099 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.932119 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:04 crc kubenswrapper[4711]: I1203 12:16:04.932130 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:04Z","lastTransitionTime":"2025-12-03T12:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.034520 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.034549 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.034558 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.034570 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.034578 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:05Z","lastTransitionTime":"2025-12-03T12:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.137427 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.137475 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.137487 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.137503 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.137514 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:05Z","lastTransitionTime":"2025-12-03T12:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.239501 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.239548 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.239565 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.239586 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.239599 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:05Z","lastTransitionTime":"2025-12-03T12:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.341772 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.341817 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.341826 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.341841 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.341851 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:05Z","lastTransitionTime":"2025-12-03T12:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.444477 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.444525 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.444539 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.444556 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.444567 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:05Z","lastTransitionTime":"2025-12-03T12:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.547289 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.547328 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.547336 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.547350 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.547361 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:05Z","lastTransitionTime":"2025-12-03T12:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.650005 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.650037 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.650047 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.650100 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.650113 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:05Z","lastTransitionTime":"2025-12-03T12:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.753493 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.753561 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.753585 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.753612 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.753630 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:05Z","lastTransitionTime":"2025-12-03T12:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.817132 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:16:05 crc kubenswrapper[4711]: E1203 12:16:05.817334 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.818757 4711 scope.go:117] "RemoveContainer" containerID="f678b995eb69a7797bde27947370a777ae334b18c0c79be36dd090741330d5b3" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.856396 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.856433 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.856441 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.856456 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.856466 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:05Z","lastTransitionTime":"2025-12-03T12:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.960066 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.960120 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.960133 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.960152 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:05 crc kubenswrapper[4711]: I1203 12:16:05.960165 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:05Z","lastTransitionTime":"2025-12-03T12:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.063950 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.064253 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.064261 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.064278 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.064287 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:06Z","lastTransitionTime":"2025-12-03T12:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.166634 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.166673 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.166682 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.166697 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.166707 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:06Z","lastTransitionTime":"2025-12-03T12:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.253747 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ct6xt_33d2332f-fdac-42be-891e-7eaef0e7ca9d/ovnkube-controller/2.log" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.256425 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" event={"ID":"33d2332f-fdac-42be-891e-7eaef0e7ca9d","Type":"ContainerStarted","Data":"d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654"} Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.256825 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.269188 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.269228 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.269237 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.269251 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.269260 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:06Z","lastTransitionTime":"2025-12-03T12:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.269871 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e483
1acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.284891 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.327853 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.339517 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba
4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.350265 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c91e83d-5ec0-4d5d-a152-006c7c6c00c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1492a930ad0b8a0e49c66235300e2d199ff927035f6dfa3bd51aefc59470dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200cc11da19da4cb96844e0e59f1114f5bff8bedda1e49dac120f7a20f30b834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4f7fe5dd73fcc75c47d02eaace24544071dbb6fd692964dc4e03df24e0fdfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19df5db43c6bd42f2676230cceec988e2976753f3b41b3d5381a18910570c4a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://19df5db43c6bd42f2676230cceec988e2976753f3b41b3d5381a18910570c4a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.357296 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.357334 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.357342 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.357356 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.357366 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:06Z","lastTransitionTime":"2025-12-03T12:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.362574 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:06 crc kubenswrapper[4711]: E1203 12:16:06.371612 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"message\\\":\\\"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redh
at/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99
d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815
\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\"
:448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.374316 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416
f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.379381 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.379420 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.379432 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.379450 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.379463 4711 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:06Z","lastTransitionTime":"2025-12-03T12:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:16:06 crc kubenswrapper[4711]: E1203 12:16:06.393466 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.396087 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metric
s-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-contro
ller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f678b995eb69a7797bde27947370a777ae334b18c0c79be36dd090741330d5b3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:15:33Z\\\",\\\"message\\\":\\\"workPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 12:15:33.666262 6363 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.666491 6363 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:15:33.666538 6363 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.666700 6363 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.666829 6363 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.667208 6363 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 12:15:33.668065 6363 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 12:15:33.668094 6363 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 12:15:33.668128 6363 factory.go:656] Stopping watch factory\\\\nI1203 12:15:33.668129 6363 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.397455 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.397493 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.397505 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.397520 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.397530 4711 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:06Z","lastTransitionTime":"2025-12-03T12:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.407026 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd9tz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdb7f01e-b2fd-49da-b7de-621da238d797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd9tz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:06 crc 
kubenswrapper[4711]: E1203 12:16:06.410245 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.414679 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.414880 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.414999 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.415211 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.415371 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:06Z","lastTransitionTime":"2025-12-03T12:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.420879 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1df44dcc-77dc-4f9d-add9-f94f6e32a7e1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9805346e5e71e3954a1c2e83f16f8ec65048b2d76f9def6362e4a65191e15678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a722d33ed10d5693ae916d46d2bac91ce7d7709f0933ce882ea8e1292700de2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a722d33ed10d5693ae916d46d2bac91ce7d7709f0933ce882ea8e1292700de2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:06 crc kubenswrapper[4711]: E1203 12:16:06.428366 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.431698 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.431758 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.431775 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.431797 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.431812 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:06Z","lastTransitionTime":"2025-12-03T12:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.431921 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.443529 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:06 crc kubenswrapper[4711]: E1203 12:16:06.444774 4711 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b4f6397-1a81-4c19-aa75-c22851b76849\\\",\\\"systemUUID\\\":\\\"4ce16df9-f35c-45ca-ad31-f0686ce28357\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:06 crc kubenswrapper[4711]: E1203 12:16:06.444948 4711 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.446534 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.446626 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.446680 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.446737 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.446803 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:06Z","lastTransitionTime":"2025-12-03T12:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.457411 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad2a184bbabe1f39e87385729c4f5006623e99a5008fff374ff8754bba2f093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:15:50Z\\\",\\\"message\\\":\\\"2025-12-03T12:15:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0aff7da2-5e80-4447-b3fa-e429af0f7d7f\\\\n2025-12-03T12:15:04+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0aff7da2-5e80-4447-b3fa-e429af0f7d7f to /host/opt/cni/bin/\\\\n2025-12-03T12:15:05Z [verbose] multus-daemon started\\\\n2025-12-03T12:15:05Z [verbose] Readiness Indicator file check\\\\n2025-12-03T12:15:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.478062 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ca10d47adbb1e21793d3c069a06e2bcaf9134f0a194fd74c97632317dbbaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6c4
b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.492473 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.505304 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.514860 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.525037 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114241a2-fe6b-43a6-957c-d215ce55737a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7bdd5e9a9885bccbf422740ad714b91e3eb1cc829bddaa977540e1e2be4b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458346453f3d04e596b36d75a5a157cb23f385d76b0afad50ace54255e6b7553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h648m\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.549172 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.549198 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.549206 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.549218 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.549228 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:06Z","lastTransitionTime":"2025-12-03T12:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.652314 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.652403 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.652418 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.652444 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.652465 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:06Z","lastTransitionTime":"2025-12-03T12:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.755830 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.755887 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.755901 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.755952 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.755972 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:06Z","lastTransitionTime":"2025-12-03T12:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.817126 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.817148 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.817236 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:06 crc kubenswrapper[4711]: E1203 12:16:06.817621 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:16:06 crc kubenswrapper[4711]: E1203 12:16:06.817470 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:16:06 crc kubenswrapper[4711]: E1203 12:16:06.817773 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.859297 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.859349 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.859359 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.859378 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.859389 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:06Z","lastTransitionTime":"2025-12-03T12:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.962189 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.962225 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.962235 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.962250 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:06 crc kubenswrapper[4711]: I1203 12:16:06.962261 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:06Z","lastTransitionTime":"2025-12-03T12:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.064603 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.064658 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.064672 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.064695 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.064711 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:07Z","lastTransitionTime":"2025-12-03T12:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.167578 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.167612 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.167622 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.167637 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.167647 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:07Z","lastTransitionTime":"2025-12-03T12:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.260841 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ct6xt_33d2332f-fdac-42be-891e-7eaef0e7ca9d/ovnkube-controller/3.log" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.261896 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ct6xt_33d2332f-fdac-42be-891e-7eaef0e7ca9d/ovnkube-controller/2.log" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.264544 4711 generic.go:334] "Generic (PLEG): container finished" podID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerID="d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654" exitCode=1 Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.264582 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" event={"ID":"33d2332f-fdac-42be-891e-7eaef0e7ca9d","Type":"ContainerDied","Data":"d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654"} Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.264618 4711 scope.go:117] "RemoveContainer" containerID="f678b995eb69a7797bde27947370a777ae334b18c0c79be36dd090741330d5b3" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.265411 4711 scope.go:117] "RemoveContainer" containerID="d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654" Dec 03 12:16:07 crc kubenswrapper[4711]: E1203 12:16:07.265589 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ct6xt_openshift-ovn-kubernetes(33d2332f-fdac-42be-891e-7eaef0e7ca9d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.278983 4711 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.279208 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.279275 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.279336 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.279392 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:07Z","lastTransitionTime":"2025-12-03T12:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.295449 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f678b995eb69a7797bde27947370a777ae334b18c0c79be36dd090741330d5b3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:15:33Z\\\",\\\"message\\\":\\\"workPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 12:15:33.666262 6363 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.666491 6363 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:15:33.666538 6363 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.666700 6363 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.666829 6363 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:15:33.667208 6363 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 12:15:33.668065 6363 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 12:15:33.668094 6363 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 12:15:33.668128 6363 factory.go:656] Stopping watch factory\\\\nI1203 12:15:33.668129 6363 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"message\\\":\\\"go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-multus/multus-admission-controller]} name:Service_openshift-multus/multus-admission-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e 
Where:[where column _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1203 12:16:06.639143 6709 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to s\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\
\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrid
es\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.306275 
4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd9tz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdb7f01e-b2fd-49da-b7de-621da238d797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd9tz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:07 crc 
kubenswrapper[4711]: I1203 12:16:07.319404 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c91e83d-5ec0-4d5d-a152-006c7c6c00c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1492a930ad0b8a0e49c66235300e2d199ff927035f6dfa3bd51aefc59470dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200cc11da19da4cb96844e0e59f1114f5bff8bedda1e49dac120f7a20f30b834\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4f7fe5dd73fcc75c47d02eaace24544071dbb6fd692964dc4e03df24e0fdfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19df5db43c6bd42f2676230cceec988e2976753f3b41b3d5381a18910570c4a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19df5db43c6bd42f2676230cceec988e2976753f3b41b3d5381a18910570c4a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.333232 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T12:16:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.345024 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.357763 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.368308 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad2a184bbabe1f39e87385729c4f5006623e99a5008fff374ff8754bba2f093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:15:50Z\\\",\\\"message\\\":\\\"2025-12-03T12:15:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0aff7da2-5e80-4447-b3fa-e429af0f7d7f\\\\n2025-12-03T12:15:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0aff7da2-5e80-4447-b3fa-e429af0f7d7f to /host/opt/cni/bin/\\\\n2025-12-03T12:15:05Z [verbose] multus-daemon started\\\\n2025-12-03T12:15:05Z [verbose] 
Readiness Indicator file check\\\\n2025-12-03T12:15:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.381367 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.381407 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.381417 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.381432 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.381441 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:07Z","lastTransitionTime":"2025-12-03T12:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.381282 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80ca10d47adbb1e21793d3c069a06e2bcaf9134f0a194fd74c97632317dbbaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.390695 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1df44dcc-77dc-4f9d-add9-f94f6e32a7e1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9805346e5e71e3954a1c2e83f16f8ec65048b2d76f9def6362e4a65191e15678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e
18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a722d33ed10d5693ae916d46d2bac91ce7d7709f0933ce882ea8e1292700de2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a722d33ed10d5693ae916d46d2bac91ce7d7709f0933ce882ea8e1292700de2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:07 crc 
kubenswrapper[4711]: I1203 12:16:07.400489 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.409247 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114241a2-fe6b-43a6-957c-d215ce55737a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7bdd5e9a9885bccbf422740ad714b91e3eb1cc829bddaa977540e1e2be4b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458346453f3d04e596b36d75a5a157cb23f38
5d76b0afad50ace54255e6b7553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h648m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.420066 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f9
2b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.431567 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.439566 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.448607 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.458448 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",
\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha2
56:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.468554 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.478867 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.483717 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.483780 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.483791 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.483811 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.483823 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:07Z","lastTransitionTime":"2025-12-03T12:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.586404 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.586447 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.586458 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.586477 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.586488 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:07Z","lastTransitionTime":"2025-12-03T12:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.689634 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.689681 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.689698 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.689721 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.689737 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:07Z","lastTransitionTime":"2025-12-03T12:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.791820 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.791854 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.791864 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.791881 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.791892 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:07Z","lastTransitionTime":"2025-12-03T12:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.817086 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:16:07 crc kubenswrapper[4711]: E1203 12:16:07.817241 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.894724 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.894763 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.894774 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.894791 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.894802 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:07Z","lastTransitionTime":"2025-12-03T12:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.997854 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.997973 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.998015 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.998050 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:07 crc kubenswrapper[4711]: I1203 12:16:07.998072 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:07Z","lastTransitionTime":"2025-12-03T12:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.101474 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.101514 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.101523 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.101537 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.101549 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:08Z","lastTransitionTime":"2025-12-03T12:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.204239 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.204288 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.204300 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.204318 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.204332 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:08Z","lastTransitionTime":"2025-12-03T12:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.270176 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ct6xt_33d2332f-fdac-42be-891e-7eaef0e7ca9d/ovnkube-controller/3.log" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.276689 4711 scope.go:117] "RemoveContainer" containerID="d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654" Dec 03 12:16:08 crc kubenswrapper[4711]: E1203 12:16:08.277260 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ct6xt_openshift-ovn-kubernetes(33d2332f-fdac-42be-891e-7eaef0e7ca9d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.295425 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c91e83d-5ec0-4d5d-a152-006c7c6c00c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1492a930ad0b8a0e49c66235300e2d199ff927035f6dfa3bd51aefc59470dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200cc11da19da4cb96844e0e59f1114f5bff8bedda1e49dac120f7a20f30b834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4f7fe5dd73fcc75c47d02eaace24544071dbb6fd692964dc4e03df24e0fdfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19df5db43c6bd42f2676230cceec988e2976753f3b41b3d5381a18910570c4a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://19df5db43c6bd42f2676230cceec988e2976753f3b41b3d5381a18910570c4a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:08Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.307856 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.307964 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.307993 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.308057 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.308084 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:08Z","lastTransitionTime":"2025-12-03T12:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.315751 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:08Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.333298 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:08Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.365310 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f
36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"message\\\":\\\"go:42] Configuring OVN: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-multus/multus-admission-controller]} name:Service_openshift-multus/multus-admission-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1203 12:16:06.639143 6709 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to s\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:16:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ct6xt_openshift-ovn-kubernetes(33d2332f-fdac-42be-891e-7eaef0e7ca9d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb6
8d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:08Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.382132 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd9tz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdb7f01e-b2fd-49da-b7de-621da238d797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd9tz\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:08Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.398311 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1df44dcc-77dc-4f9d-add9-f94f6e32a7e1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9805346e5e71e3954a1c2e83f16f8ec65048b2d76f9def6362e4a65191e15678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a722d33ed10d5693ae916d46d2bac91ce7d7709f0933ce882ea8e1292700de2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a722d33ed10d5693ae916d46d2bac91ce7d7709f0933ce882ea8e1292700de2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:08Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.411788 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.411824 4711 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.411833 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.411849 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.411858 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:08Z","lastTransitionTime":"2025-12-03T12:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.415306 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:08Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.434390 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:08Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.451074 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad2a184bbabe1f39e87385729c4f5006623e99a5008fff374ff8754bba2f093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:15:50Z\\\",\\\"message\\\":\\\"2025-12-03T12:15:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0aff7da2-5e80-4447-b3fa-e429af0f7d7f\\\\n2025-12-03T12:15:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0aff7da2-5e80-4447-b3fa-e429af0f7d7f to /host/opt/cni/bin/\\\\n2025-12-03T12:15:05Z [verbose] multus-daemon started\\\\n2025-12-03T12:15:05Z [verbose] 
Readiness Indicator file check\\\\n2025-12-03T12:15:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:08Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.471251 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80c
a10d47adbb1e21793d3c069a06e2bcaf9134f0a194fd74c97632317dbbaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:08Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.491362 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f9
2b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:08Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.510574 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:08Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.515745 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.515818 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.515836 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.515864 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.515891 4711 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:08Z","lastTransitionTime":"2025-12-03T12:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.524429 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:08Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.537751 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114241a2-fe6b-43a6-957c-d215ce55737a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7bdd5e9a9885bccbf422740ad714b91e3eb1cc829bddaa977540e1e2be4b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458346453f3d04e596b36d75a5a157cb23f38
5d76b0afad50ace54255e6b7553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h648m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:08Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.551070 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:08Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.566357 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:08Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.578983 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:08Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.591970 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba
4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:08Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.619430 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.619470 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.619481 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:08 crc 
kubenswrapper[4711]: I1203 12:16:08.619518 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.619529 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:08Z","lastTransitionTime":"2025-12-03T12:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.723787 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.723831 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.723841 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.723859 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.723869 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:08Z","lastTransitionTime":"2025-12-03T12:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.816594 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.816594 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.816719 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:16:08 crc kubenswrapper[4711]: E1203 12:16:08.816987 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:16:08 crc kubenswrapper[4711]: E1203 12:16:08.817174 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:16:08 crc kubenswrapper[4711]: E1203 12:16:08.817282 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.828663 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.829008 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.829029 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.829057 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.829072 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:08Z","lastTransitionTime":"2025-12-03T12:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.932106 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.932150 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.932163 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.932181 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:08 crc kubenswrapper[4711]: I1203 12:16:08.932195 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:08Z","lastTransitionTime":"2025-12-03T12:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.034653 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.034685 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.034694 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.034708 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.034718 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:09Z","lastTransitionTime":"2025-12-03T12:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.137106 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.137144 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.137152 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.137166 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.137175 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:09Z","lastTransitionTime":"2025-12-03T12:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.239434 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.239501 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.239517 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.239540 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.239555 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:09Z","lastTransitionTime":"2025-12-03T12:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.342775 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.342878 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.342894 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.342959 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.342973 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:09Z","lastTransitionTime":"2025-12-03T12:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.445102 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.445168 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.445180 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.445195 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.445208 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:09Z","lastTransitionTime":"2025-12-03T12:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.547543 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.547613 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.547655 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.547674 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.547686 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:09Z","lastTransitionTime":"2025-12-03T12:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.649884 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.649959 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.649972 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.649986 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.649996 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:09Z","lastTransitionTime":"2025-12-03T12:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.752375 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.752443 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.752466 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.752497 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.752517 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:09Z","lastTransitionTime":"2025-12-03T12:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.817038 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:16:09 crc kubenswrapper[4711]: E1203 12:16:09.817251 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.856523 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.856563 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.856575 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.856592 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.856603 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:09Z","lastTransitionTime":"2025-12-03T12:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.959850 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.960163 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.960298 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.960433 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:09 crc kubenswrapper[4711]: I1203 12:16:09.960707 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:09Z","lastTransitionTime":"2025-12-03T12:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.063339 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.063371 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.063380 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.063394 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.063403 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:10Z","lastTransitionTime":"2025-12-03T12:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.166386 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.166425 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.166435 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.166452 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.166463 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:10Z","lastTransitionTime":"2025-12-03T12:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.268546 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.268612 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.268629 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.268651 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.268669 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:10Z","lastTransitionTime":"2025-12-03T12:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.372521 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.372567 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.372583 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.372602 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.372617 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:10Z","lastTransitionTime":"2025-12-03T12:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.475000 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.475035 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.475045 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.475061 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.475072 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:10Z","lastTransitionTime":"2025-12-03T12:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.577840 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.578004 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.578068 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.578095 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.578112 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:10Z","lastTransitionTime":"2025-12-03T12:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.679884 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.679946 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.679958 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.679975 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.679985 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:10Z","lastTransitionTime":"2025-12-03T12:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.783398 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.783479 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.783503 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.783538 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.783557 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:10Z","lastTransitionTime":"2025-12-03T12:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.817199 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:10 crc kubenswrapper[4711]: E1203 12:16:10.817365 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.817393 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.817492 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:16:10 crc kubenswrapper[4711]: E1203 12:16:10.817957 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:16:10 crc kubenswrapper[4711]: E1203 12:16:10.818146 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.886142 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.886225 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.886241 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.886268 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.886286 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:10Z","lastTransitionTime":"2025-12-03T12:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.989473 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.989526 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.989540 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.989557 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:10 crc kubenswrapper[4711]: I1203 12:16:10.989570 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:10Z","lastTransitionTime":"2025-12-03T12:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.094347 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.094415 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.094429 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.094448 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.094610 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:11Z","lastTransitionTime":"2025-12-03T12:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.198079 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.198128 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.198138 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.198155 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.198166 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:11Z","lastTransitionTime":"2025-12-03T12:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.300697 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.300773 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.300788 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.300807 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.300820 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:11Z","lastTransitionTime":"2025-12-03T12:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.403938 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.403986 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.403998 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.404014 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.404024 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:11Z","lastTransitionTime":"2025-12-03T12:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.507060 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.507122 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.507132 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.507145 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.507153 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:11Z","lastTransitionTime":"2025-12-03T12:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.610809 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.610869 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.610884 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.610923 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.610937 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:11Z","lastTransitionTime":"2025-12-03T12:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.713111 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.713144 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.713155 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.713170 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.713181 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:11Z","lastTransitionTime":"2025-12-03T12:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.815561 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.815629 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.815659 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.815677 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.815690 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:11Z","lastTransitionTime":"2025-12-03T12:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.816537 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:16:11 crc kubenswrapper[4711]: E1203 12:16:11.816622 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.829863 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1df44dcc-77dc-4f9d-add9-f94f6e32a7e1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9805346e5e71e3954a1c2e83f16f8ec65048b2d76f9def6362e4a65191e15678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.1
68.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a722d33ed10d5693ae916d46d2bac91ce7d7709f0933ce882ea8e1292700de2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a722d33ed10d5693ae916d46d2bac91ce7d7709f0933ce882ea8e1292700de2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.842626 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.855700 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.869732 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwhcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216c3ac8-462c-49ec-87a2-c935d0c4ad25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad2a184bbabe1f39e87385729c4f5006623e99a5008fff374ff8754bba2f093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:15:50Z\\\",\\\"message\\\":\\\"2025-12-03T12:15:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0aff7da2-5e80-4447-b3fa-e429af0f7d7f\\\\n2025-12-03T12:15:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0aff7da2-5e80-4447-b3fa-e429af0f7d7f to /host/opt/cni/bin/\\\\n2025-12-03T12:15:05Z [verbose] multus-daemon started\\\\n2025-12-03T12:15:05Z [verbose] 
Readiness Indicator file check\\\\n2025-12-03T12:15:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ztlqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwhcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.889864 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sfpts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f176e2-84b4-4f7d-bf31-94ecb9f59e90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80c
a10d47adbb1e21793d3c069a06e2bcaf9134f0a194fd74c97632317dbbaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d82aae7ea514c863d55fe606d022d67d7b0f770bd5c659964d75412c30e206ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595e7cdc2a3b8a8a496fabec98afece84a9f1a4cd38c881b225bfbae4d9166d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://551e71e633b7f6367898945bff21fccd1ebce990fd5a63a90098ae4dd7f42566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a6c4b26c795cf3540b90e9a7c2b9a683251b798e54739b1b881528fbb564b2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0af7e1ad11ca77e50fcd64079e1975de934234c8c24d706dfe77401bdc3a007\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3ab0e9bb027d7e9109b336238bcf22f6a48a4a4f914ca5173b1e510a580b80a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-px98s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sfpts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.902786 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1d3b7df-87c6-4d58-a825-f8b19ad98966\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:15:00Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:14:54.104439 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:14:54.106778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1353459368/tls.crt::/tmp/serving-cert-1353459368/tls.key\\\\\\\"\\\\nI1203 12:15:00.569392 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:15:00.574055 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:15:00.574123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:15:00.574178 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:15:00.574193 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:15:00.586232 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 12:15:00.586426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586439 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:15:00.586449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:15:00.586455 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:15:00.586461 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:15:00.586467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 12:15:00.586320 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1203 12:15:00.590191 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f4cc04b8baea58a0fdcd83259fa4e6f9
2b58210713b70082fcfd17da952c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.934758 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.934795 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.934804 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.934818 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.934828 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:11Z","lastTransitionTime":"2025-12-03T12:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.941530 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25c00d4111a3842d913a3dd790ea96c94024752369b73b82c74e52b27ffa944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc6321c95e83eaeae174cf287ad1526fca7f5827c085bbeee545afb90ce60c10\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.957817 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n9ptm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b974b5-a578-4c3c-b6a0-0038d19cb565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251dd2455f51423f7ca6f72a25ee5e041a09be206e4d8b62a75df6c027788c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpcvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n9ptm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:11 crc kubenswrapper[4711]: I1203 12:16:11.996864 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114241a2-fe6b-43a6-957c-d215ce55737a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7bdd5e9a9885bccbf422740ad714b91e3eb1cc829bddaa977540e1e2be4b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458346453f3d04e596b36d75a5a157cb23f385d76b0afad50ace54255e6b7553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pll8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h648m\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.008055 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b503bd-1713-49ea-b76a-afcb522fbee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b75883e4afd490ed232b2c3f9e8661c6bd83e8bef834a382466c60cca5b1ca81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:1
4:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee79271dfb63674f6ea4b9ebe300d1fe6ffe47a61dc8a77154cbc196671ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5362dc9d7af7da68aa2c1c37c3c20c7bccb952706ad3a5c9d19eeea601aaec9
e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.019498 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8fc603c451e5119321fb200e1a9fab4779cbf13861ce3e5eb3d93025f9f6109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.031288 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.036976 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.037017 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.037028 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.037049 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.037062 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:12Z","lastTransitionTime":"2025-12-03T12:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.041320 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"776e7d35-d59b-4d4a-97cd-aec4f2441c1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b180555dddd49717acf46cda5fc0fa026eea6ec90f08786cb481802f15934e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82c5517a78be53885c5d2e8a414e572be5defba4fb0e90131d85a364814f137\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rmcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-52jgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.051021 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c91e83d-5ec0-4d5d-a152-006c7c6c00c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1492a930ad0b8a0e49c66235300e2d199ff927035f6dfa3bd51aefc59470dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200cc11da19da4cb96844e0e59f1114f5bff8bedda1e49dac120f7a20f30b834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4f7fe5dd73fcc75c47d02eaace24544071dbb6fd692964dc4e03df24e0fdfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19df5db43c6bd42f2676230cceec988e2976753f3b41b3d5381a18910570c4a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://19df5db43c6bd42f2676230cceec988e2976753f3b41b3d5381a18910570c4a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.061618 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe6a9016ae4ebaf095fcd0887f0c8e6cdf905e4729447f54a0d964b48d2f916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T12:16:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.069494 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g4t8g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fe37859-a67f-4f4b-9c03-57db1ba5e5e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7edc3720e7f587c43d01edc901d0386f261008b7bcfb9a4e87192aedcb7a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qfh4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g4t8g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.087522 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33d2332f-fdac-42be-891e-7eaef0e7ca9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:16:06Z\\\",\\\"message\\\":\\\"go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-multus/multus-admission-controller]} name:Service_openshift-multus/multus-admission-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none 
reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1203 12:16:06.639143 6709 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to s\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:16:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ct6xt_openshift-ovn-kubernetes(33d2332f-fdac-42be-891e-7eaef0e7ca9d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b62a504dfb2d740fb6
8d5695c25ff46857b97b339019f0bf16b303c21d01cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5vs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ct6xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.098428 4711 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd9tz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdb7f01e-b2fd-49da-b7de-621da238d797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s66kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd9tz\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:16:12Z is after 2025-08-24T17:21:41Z" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.138966 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.139032 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.139057 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.139125 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.139158 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:12Z","lastTransitionTime":"2025-12-03T12:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.242358 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.242408 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.242418 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.242434 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.242446 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:12Z","lastTransitionTime":"2025-12-03T12:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.345714 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.345775 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.345792 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.345814 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.345835 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:12Z","lastTransitionTime":"2025-12-03T12:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.448585 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.448638 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.448657 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.448677 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.448692 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:12Z","lastTransitionTime":"2025-12-03T12:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.551848 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.551963 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.551978 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.551997 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.552008 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:12Z","lastTransitionTime":"2025-12-03T12:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.654214 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.654251 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.654262 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.654275 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.654286 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:12Z","lastTransitionTime":"2025-12-03T12:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.756160 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.756208 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.756221 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.756243 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.756258 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:12Z","lastTransitionTime":"2025-12-03T12:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.816844 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.816875 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.816897 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:12 crc kubenswrapper[4711]: E1203 12:16:12.817208 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:16:12 crc kubenswrapper[4711]: E1203 12:16:12.817388 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:16:12 crc kubenswrapper[4711]: E1203 12:16:12.817453 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.859798 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.859857 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.859870 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.859891 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.859926 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:12Z","lastTransitionTime":"2025-12-03T12:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.963293 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.963327 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.963337 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.963353 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:12 crc kubenswrapper[4711]: I1203 12:16:12.963363 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:12Z","lastTransitionTime":"2025-12-03T12:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.067091 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.067223 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.067256 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.067287 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.067309 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:13Z","lastTransitionTime":"2025-12-03T12:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.170859 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.170951 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.170971 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.171000 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.171017 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:13Z","lastTransitionTime":"2025-12-03T12:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.274132 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.274212 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.274224 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.274246 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.274262 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:13Z","lastTransitionTime":"2025-12-03T12:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.376887 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.376943 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.376956 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.376973 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.376985 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:13Z","lastTransitionTime":"2025-12-03T12:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.487719 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.487772 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.487789 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.487810 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.487829 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:13Z","lastTransitionTime":"2025-12-03T12:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.591066 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.591113 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.591127 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.591151 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.591166 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:13Z","lastTransitionTime":"2025-12-03T12:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.693494 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.693546 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.693558 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.693574 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.693586 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:13Z","lastTransitionTime":"2025-12-03T12:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.796532 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.796567 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.796579 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.796593 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.796603 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:13Z","lastTransitionTime":"2025-12-03T12:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.816740 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:16:13 crc kubenswrapper[4711]: E1203 12:16:13.816894 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.899233 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.899266 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.899276 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.899293 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:13 crc kubenswrapper[4711]: I1203 12:16:13.899310 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:13Z","lastTransitionTime":"2025-12-03T12:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.002671 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.002723 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.002734 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.002751 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.002764 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:14Z","lastTransitionTime":"2025-12-03T12:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.108032 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.108083 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.108116 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.108139 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.108156 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:14Z","lastTransitionTime":"2025-12-03T12:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.210965 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.211008 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.211018 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.211036 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.211050 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:14Z","lastTransitionTime":"2025-12-03T12:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.313594 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.313634 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.313648 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.313665 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.313678 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:14Z","lastTransitionTime":"2025-12-03T12:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.416705 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.416778 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.416799 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.416829 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.416852 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:14Z","lastTransitionTime":"2025-12-03T12:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.519557 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.519617 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.519634 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.519658 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.519677 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:14Z","lastTransitionTime":"2025-12-03T12:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.625989 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.626060 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.626083 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.626110 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.626126 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:14Z","lastTransitionTime":"2025-12-03T12:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.729847 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.729942 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.729964 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.729991 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.730012 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:14Z","lastTransitionTime":"2025-12-03T12:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.816790 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:16:14 crc kubenswrapper[4711]: E1203 12:16:14.817005 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.817218 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.817292 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:14 crc kubenswrapper[4711]: E1203 12:16:14.817406 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:16:14 crc kubenswrapper[4711]: E1203 12:16:14.817587 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.837997 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.838049 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.838067 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.838089 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.838106 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:14Z","lastTransitionTime":"2025-12-03T12:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.940686 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.940723 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.940733 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.940750 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:14 crc kubenswrapper[4711]: I1203 12:16:14.940773 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:14Z","lastTransitionTime":"2025-12-03T12:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.044745 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.044793 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.044806 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.044833 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.044854 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:15Z","lastTransitionTime":"2025-12-03T12:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.146859 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.146934 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.146951 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.146969 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.146981 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:15Z","lastTransitionTime":"2025-12-03T12:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.249720 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.249753 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.249761 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.249774 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.249783 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:15Z","lastTransitionTime":"2025-12-03T12:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.353007 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.353054 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.353063 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.353077 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.353087 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:15Z","lastTransitionTime":"2025-12-03T12:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.456248 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.456305 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.456321 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.456339 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.456350 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:15Z","lastTransitionTime":"2025-12-03T12:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.558427 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.558494 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.558511 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.558534 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.558553 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:15Z","lastTransitionTime":"2025-12-03T12:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.661216 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.661259 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.661273 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.661291 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.661304 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:15Z","lastTransitionTime":"2025-12-03T12:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.764093 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.764136 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.764146 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.764167 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.764179 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:15Z","lastTransitionTime":"2025-12-03T12:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.816955 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:16:15 crc kubenswrapper[4711]: E1203 12:16:15.817141 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.866503 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.866551 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.866566 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.866582 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.866592 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:15Z","lastTransitionTime":"2025-12-03T12:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.969861 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.969970 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.969991 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.970014 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:15 crc kubenswrapper[4711]: I1203 12:16:15.970030 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:15Z","lastTransitionTime":"2025-12-03T12:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.072020 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.072063 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.072075 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.072095 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.072106 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:16Z","lastTransitionTime":"2025-12-03T12:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.175520 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.175561 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.175575 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.175591 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.175602 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:16Z","lastTransitionTime":"2025-12-03T12:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.278458 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.278492 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.278501 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.278515 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.278525 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:16Z","lastTransitionTime":"2025-12-03T12:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.381583 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.381631 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.381643 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.381660 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.381672 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:16Z","lastTransitionTime":"2025-12-03T12:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.485128 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.485202 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.485221 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.485247 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.485269 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:16Z","lastTransitionTime":"2025-12-03T12:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.587886 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.587975 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.587998 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.588019 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.588038 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:16Z","lastTransitionTime":"2025-12-03T12:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.691303 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.691350 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.691362 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.691378 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.691387 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:16Z","lastTransitionTime":"2025-12-03T12:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.792561 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.792628 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.792645 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.792671 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.792688 4711 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:16:16Z","lastTransitionTime":"2025-12-03T12:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.817125 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.817174 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.817226 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:16:16 crc kubenswrapper[4711]: E1203 12:16:16.817302 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:16:16 crc kubenswrapper[4711]: E1203 12:16:16.817367 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:16:16 crc kubenswrapper[4711]: E1203 12:16:16.817547 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.846243 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-cfgng"] Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.846623 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cfgng" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.849670 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.849821 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.849942 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.850040 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.882414 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=73.882389796 podStartE2EDuration="1m13.882389796s" podCreationTimestamp="2025-12-03 12:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:16.866420501 +0000 UTC m=+95.535671786" watchObservedRunningTime="2025-12-03 12:16:16.882389796 +0000 UTC m=+95.551641051" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.913448 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podStartSLOduration=74.913429455 podStartE2EDuration="1m14.913429455s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:16.913339692 +0000 UTC m=+95.582590967" watchObservedRunningTime="2025-12-03 
12:16:16.913429455 +0000 UTC m=+95.582680710" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.925869 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=40.925852715 podStartE2EDuration="40.925852715s" podCreationTimestamp="2025-12-03 12:15:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:16.925445195 +0000 UTC m=+95.594696470" watchObservedRunningTime="2025-12-03 12:16:16.925852715 +0000 UTC m=+95.595103970" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.944839 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-g4t8g" podStartSLOduration=74.944822911 podStartE2EDuration="1m14.944822911s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:16.944287117 +0000 UTC m=+95.613538382" watchObservedRunningTime="2025-12-03 12:16:16.944822911 +0000 UTC m=+95.614074166" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.958971 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/139d0d26-049c-4651-be9c-4de1179a5c3f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-cfgng\" (UID: \"139d0d26-049c-4651-be9c-4de1179a5c3f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cfgng" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.959025 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/139d0d26-049c-4651-be9c-4de1179a5c3f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cfgng\" (UID: 
\"139d0d26-049c-4651-be9c-4de1179a5c3f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cfgng" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.959067 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/139d0d26-049c-4651-be9c-4de1179a5c3f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cfgng\" (UID: \"139d0d26-049c-4651-be9c-4de1179a5c3f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cfgng" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.959129 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/139d0d26-049c-4651-be9c-4de1179a5c3f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cfgng\" (UID: \"139d0d26-049c-4651-be9c-4de1179a5c3f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cfgng" Dec 03 12:16:16 crc kubenswrapper[4711]: I1203 12:16:16.959147 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/139d0d26-049c-4651-be9c-4de1179a5c3f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cfgng\" (UID: \"139d0d26-049c-4651-be9c-4de1179a5c3f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cfgng" Dec 03 12:16:17 crc kubenswrapper[4711]: I1203 12:16:17.003753 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-sfpts" podStartSLOduration=75.003729631 podStartE2EDuration="1m15.003729631s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:16.992400199 +0000 UTC m=+95.661651454" 
watchObservedRunningTime="2025-12-03 12:16:17.003729631 +0000 UTC m=+95.672980886" Dec 03 12:16:17 crc kubenswrapper[4711]: I1203 12:16:17.018205 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=29.018189547 podStartE2EDuration="29.018189547s" podCreationTimestamp="2025-12-03 12:15:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:17.005021795 +0000 UTC m=+95.674273070" watchObservedRunningTime="2025-12-03 12:16:17.018189547 +0000 UTC m=+95.687440812" Dec 03 12:16:17 crc kubenswrapper[4711]: I1203 12:16:17.047792 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-gwhcr" podStartSLOduration=75.047773475 podStartE2EDuration="1m15.047773475s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:17.047453256 +0000 UTC m=+95.716704521" watchObservedRunningTime="2025-12-03 12:16:17.047773475 +0000 UTC m=+95.717024730" Dec 03 12:16:17 crc kubenswrapper[4711]: I1203 12:16:17.059770 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/139d0d26-049c-4651-be9c-4de1179a5c3f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cfgng\" (UID: \"139d0d26-049c-4651-be9c-4de1179a5c3f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cfgng" Dec 03 12:16:17 crc kubenswrapper[4711]: I1203 12:16:17.059807 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/139d0d26-049c-4651-be9c-4de1179a5c3f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-cfgng\" (UID: 
\"139d0d26-049c-4651-be9c-4de1179a5c3f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cfgng" Dec 03 12:16:17 crc kubenswrapper[4711]: I1203 12:16:17.059845 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/139d0d26-049c-4651-be9c-4de1179a5c3f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cfgng\" (UID: \"139d0d26-049c-4651-be9c-4de1179a5c3f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cfgng" Dec 03 12:16:17 crc kubenswrapper[4711]: I1203 12:16:17.059894 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/139d0d26-049c-4651-be9c-4de1179a5c3f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cfgng\" (UID: \"139d0d26-049c-4651-be9c-4de1179a5c3f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cfgng" Dec 03 12:16:17 crc kubenswrapper[4711]: I1203 12:16:17.059927 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/139d0d26-049c-4651-be9c-4de1179a5c3f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cfgng\" (UID: \"139d0d26-049c-4651-be9c-4de1179a5c3f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cfgng" Dec 03 12:16:17 crc kubenswrapper[4711]: I1203 12:16:17.059988 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/139d0d26-049c-4651-be9c-4de1179a5c3f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cfgng\" (UID: \"139d0d26-049c-4651-be9c-4de1179a5c3f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cfgng" Dec 03 12:16:17 crc kubenswrapper[4711]: I1203 12:16:17.060016 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/139d0d26-049c-4651-be9c-4de1179a5c3f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cfgng\" (UID: \"139d0d26-049c-4651-be9c-4de1179a5c3f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cfgng" Dec 03 12:16:17 crc kubenswrapper[4711]: I1203 12:16:17.060859 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/139d0d26-049c-4651-be9c-4de1179a5c3f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cfgng\" (UID: \"139d0d26-049c-4651-be9c-4de1179a5c3f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cfgng" Dec 03 12:16:17 crc kubenswrapper[4711]: I1203 12:16:17.065313 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/139d0d26-049c-4651-be9c-4de1179a5c3f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cfgng\" (UID: \"139d0d26-049c-4651-be9c-4de1179a5c3f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cfgng" Dec 03 12:16:17 crc kubenswrapper[4711]: I1203 12:16:17.075447 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=77.075430692 podStartE2EDuration="1m17.075430692s" podCreationTimestamp="2025-12-03 12:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:17.062505238 +0000 UTC m=+95.731756503" watchObservedRunningTime="2025-12-03 12:16:17.075430692 +0000 UTC m=+95.744681947" Dec 03 12:16:17 crc kubenswrapper[4711]: I1203 12:16:17.077666 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/139d0d26-049c-4651-be9c-4de1179a5c3f-kube-api-access\") pod 
\"cluster-version-operator-5c965bbfc6-cfgng\" (UID: \"139d0d26-049c-4651-be9c-4de1179a5c3f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cfgng" Dec 03 12:16:17 crc kubenswrapper[4711]: I1203 12:16:17.084605 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-n9ptm" podStartSLOduration=75.084585257 podStartE2EDuration="1m15.084585257s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:17.084293679 +0000 UTC m=+95.753544954" watchObservedRunningTime="2025-12-03 12:16:17.084585257 +0000 UTC m=+95.753836502" Dec 03 12:16:17 crc kubenswrapper[4711]: I1203 12:16:17.095764 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h648m" podStartSLOduration=75.095748535 podStartE2EDuration="1m15.095748535s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:17.094820879 +0000 UTC m=+95.764072144" watchObservedRunningTime="2025-12-03 12:16:17.095748535 +0000 UTC m=+95.764999790" Dec 03 12:16:17 crc kubenswrapper[4711]: I1203 12:16:17.160604 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cfgng" Dec 03 12:16:17 crc kubenswrapper[4711]: I1203 12:16:17.300780 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cfgng" event={"ID":"139d0d26-049c-4651-be9c-4de1179a5c3f","Type":"ContainerStarted","Data":"cf5188ce6774c29835126964382ce6ef9a10f2cfd997814621ff3e3e7ed1cbba"} Dec 03 12:16:17 crc kubenswrapper[4711]: I1203 12:16:17.817130 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:16:17 crc kubenswrapper[4711]: E1203 12:16:17.817315 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:16:18 crc kubenswrapper[4711]: I1203 12:16:18.306164 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cfgng" event={"ID":"139d0d26-049c-4651-be9c-4de1179a5c3f","Type":"ContainerStarted","Data":"e5339c6f83a5ae54bc2a54ceec3beb081587e2752e2a7c16eaed3154a4194621"} Dec 03 12:16:18 crc kubenswrapper[4711]: I1203 12:16:18.320191 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cfgng" podStartSLOduration=76.320176463 podStartE2EDuration="1m16.320176463s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:18.319760902 +0000 UTC m=+96.989012167" watchObservedRunningTime="2025-12-03 
12:16:18.320176463 +0000 UTC m=+96.989427718" Dec 03 12:16:18 crc kubenswrapper[4711]: I1203 12:16:18.816853 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:18 crc kubenswrapper[4711]: I1203 12:16:18.816933 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:16:18 crc kubenswrapper[4711]: I1203 12:16:18.816857 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:16:18 crc kubenswrapper[4711]: E1203 12:16:18.816986 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:16:18 crc kubenswrapper[4711]: E1203 12:16:18.817191 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:16:18 crc kubenswrapper[4711]: E1203 12:16:18.817299 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:16:19 crc kubenswrapper[4711]: I1203 12:16:19.816320 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:16:19 crc kubenswrapper[4711]: E1203 12:16:19.816529 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:16:20 crc kubenswrapper[4711]: I1203 12:16:20.596051 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cdb7f01e-b2fd-49da-b7de-621da238d797-metrics-certs\") pod \"network-metrics-daemon-wd9tz\" (UID: \"cdb7f01e-b2fd-49da-b7de-621da238d797\") " pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:16:20 crc kubenswrapper[4711]: E1203 12:16:20.596199 4711 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 12:16:20 crc kubenswrapper[4711]: E1203 12:16:20.596278 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cdb7f01e-b2fd-49da-b7de-621da238d797-metrics-certs podName:cdb7f01e-b2fd-49da-b7de-621da238d797 nodeName:}" failed. No retries permitted until 2025-12-03 12:17:24.596254945 +0000 UTC m=+163.265506220 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cdb7f01e-b2fd-49da-b7de-621da238d797-metrics-certs") pod "network-metrics-daemon-wd9tz" (UID: "cdb7f01e-b2fd-49da-b7de-621da238d797") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 12:16:20 crc kubenswrapper[4711]: I1203 12:16:20.816631 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:16:20 crc kubenswrapper[4711]: E1203 12:16:20.816784 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:16:20 crc kubenswrapper[4711]: I1203 12:16:20.816795 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:20 crc kubenswrapper[4711]: I1203 12:16:20.817058 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:16:20 crc kubenswrapper[4711]: E1203 12:16:20.817346 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:16:20 crc kubenswrapper[4711]: E1203 12:16:20.817471 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:16:20 crc kubenswrapper[4711]: I1203 12:16:20.817667 4711 scope.go:117] "RemoveContainer" containerID="d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654" Dec 03 12:16:20 crc kubenswrapper[4711]: E1203 12:16:20.817831 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ct6xt_openshift-ovn-kubernetes(33d2332f-fdac-42be-891e-7eaef0e7ca9d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" Dec 03 12:16:21 crc kubenswrapper[4711]: I1203 12:16:21.817149 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:16:21 crc kubenswrapper[4711]: E1203 12:16:21.818312 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:16:21 crc kubenswrapper[4711]: I1203 12:16:21.836232 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 03 12:16:22 crc kubenswrapper[4711]: I1203 12:16:22.817034 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:16:22 crc kubenswrapper[4711]: I1203 12:16:22.817094 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:16:22 crc kubenswrapper[4711]: I1203 12:16:22.817031 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:22 crc kubenswrapper[4711]: E1203 12:16:22.817277 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:16:22 crc kubenswrapper[4711]: E1203 12:16:22.817184 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:16:22 crc kubenswrapper[4711]: E1203 12:16:22.817422 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:16:23 crc kubenswrapper[4711]: I1203 12:16:23.817182 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:16:23 crc kubenswrapper[4711]: E1203 12:16:23.817870 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:16:24 crc kubenswrapper[4711]: I1203 12:16:24.816249 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:24 crc kubenswrapper[4711]: I1203 12:16:24.816261 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:16:24 crc kubenswrapper[4711]: I1203 12:16:24.816274 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:16:24 crc kubenswrapper[4711]: E1203 12:16:24.816470 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:16:24 crc kubenswrapper[4711]: E1203 12:16:24.816647 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:16:24 crc kubenswrapper[4711]: E1203 12:16:24.816773 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:16:25 crc kubenswrapper[4711]: I1203 12:16:25.816724 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:16:25 crc kubenswrapper[4711]: E1203 12:16:25.816852 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:16:26 crc kubenswrapper[4711]: I1203 12:16:26.816664 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:26 crc kubenswrapper[4711]: I1203 12:16:26.816726 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:16:26 crc kubenswrapper[4711]: I1203 12:16:26.816797 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:16:26 crc kubenswrapper[4711]: E1203 12:16:26.816875 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:16:26 crc kubenswrapper[4711]: E1203 12:16:26.817056 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:16:26 crc kubenswrapper[4711]: E1203 12:16:26.817126 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:16:27 crc kubenswrapper[4711]: I1203 12:16:27.816481 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:16:27 crc kubenswrapper[4711]: E1203 12:16:27.816627 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:16:28 crc kubenswrapper[4711]: I1203 12:16:28.816815 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:16:28 crc kubenswrapper[4711]: I1203 12:16:28.816855 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:28 crc kubenswrapper[4711]: E1203 12:16:28.817132 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:16:28 crc kubenswrapper[4711]: I1203 12:16:28.817153 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:16:28 crc kubenswrapper[4711]: E1203 12:16:28.817356 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:16:28 crc kubenswrapper[4711]: E1203 12:16:28.817454 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:16:29 crc kubenswrapper[4711]: I1203 12:16:29.817217 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:16:29 crc kubenswrapper[4711]: E1203 12:16:29.817430 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:16:30 crc kubenswrapper[4711]: I1203 12:16:30.817372 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:16:30 crc kubenswrapper[4711]: I1203 12:16:30.817404 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:16:30 crc kubenswrapper[4711]: E1203 12:16:30.817530 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:16:30 crc kubenswrapper[4711]: I1203 12:16:30.817359 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:30 crc kubenswrapper[4711]: E1203 12:16:30.817754 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:16:30 crc kubenswrapper[4711]: E1203 12:16:30.817980 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:16:31 crc kubenswrapper[4711]: I1203 12:16:31.817175 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:16:31 crc kubenswrapper[4711]: E1203 12:16:31.818086 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:16:32 crc kubenswrapper[4711]: I1203 12:16:32.816722 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:32 crc kubenswrapper[4711]: I1203 12:16:32.816830 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:16:32 crc kubenswrapper[4711]: E1203 12:16:32.816882 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:16:32 crc kubenswrapper[4711]: E1203 12:16:32.817019 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:16:32 crc kubenswrapper[4711]: I1203 12:16:32.816845 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:16:32 crc kubenswrapper[4711]: E1203 12:16:32.817232 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:16:33 crc kubenswrapper[4711]: I1203 12:16:33.817225 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:16:33 crc kubenswrapper[4711]: E1203 12:16:33.817407 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:16:34 crc kubenswrapper[4711]: I1203 12:16:34.817241 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:34 crc kubenswrapper[4711]: I1203 12:16:34.817342 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:16:34 crc kubenswrapper[4711]: I1203 12:16:34.817358 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:16:34 crc kubenswrapper[4711]: E1203 12:16:34.817456 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:16:34 crc kubenswrapper[4711]: E1203 12:16:34.817569 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:16:34 crc kubenswrapper[4711]: E1203 12:16:34.817714 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:16:34 crc kubenswrapper[4711]: I1203 12:16:34.818451 4711 scope.go:117] "RemoveContainer" containerID="d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654" Dec 03 12:16:34 crc kubenswrapper[4711]: E1203 12:16:34.818616 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ct6xt_openshift-ovn-kubernetes(33d2332f-fdac-42be-891e-7eaef0e7ca9d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" Dec 03 12:16:35 crc kubenswrapper[4711]: I1203 12:16:35.817324 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:16:35 crc kubenswrapper[4711]: E1203 12:16:35.817469 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:16:36 crc kubenswrapper[4711]: I1203 12:16:36.817133 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:16:36 crc kubenswrapper[4711]: I1203 12:16:36.817250 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:16:36 crc kubenswrapper[4711]: I1203 12:16:36.817362 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:36 crc kubenswrapper[4711]: E1203 12:16:36.817378 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:16:36 crc kubenswrapper[4711]: E1203 12:16:36.817480 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:16:36 crc kubenswrapper[4711]: E1203 12:16:36.817657 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:16:37 crc kubenswrapper[4711]: I1203 12:16:37.368341 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gwhcr_216c3ac8-462c-49ec-87a2-c935d0c4ad25/kube-multus/1.log" Dec 03 12:16:37 crc kubenswrapper[4711]: I1203 12:16:37.368941 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gwhcr_216c3ac8-462c-49ec-87a2-c935d0c4ad25/kube-multus/0.log" Dec 03 12:16:37 crc kubenswrapper[4711]: I1203 12:16:37.369003 4711 generic.go:334] "Generic (PLEG): container finished" podID="216c3ac8-462c-49ec-87a2-c935d0c4ad25" containerID="0ad2a184bbabe1f39e87385729c4f5006623e99a5008fff374ff8754bba2f093" exitCode=1 Dec 03 12:16:37 crc kubenswrapper[4711]: I1203 12:16:37.369044 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gwhcr" event={"ID":"216c3ac8-462c-49ec-87a2-c935d0c4ad25","Type":"ContainerDied","Data":"0ad2a184bbabe1f39e87385729c4f5006623e99a5008fff374ff8754bba2f093"} Dec 03 12:16:37 crc kubenswrapper[4711]: I1203 12:16:37.369104 4711 scope.go:117] "RemoveContainer" containerID="4bee637a59992f41463c27548a28195ab75f4d1447509291651c421b01c72f45" Dec 03 12:16:37 crc kubenswrapper[4711]: I1203 12:16:37.369551 4711 scope.go:117] "RemoveContainer" containerID="0ad2a184bbabe1f39e87385729c4f5006623e99a5008fff374ff8754bba2f093" Dec 03 12:16:37 crc kubenswrapper[4711]: E1203 12:16:37.369758 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-gwhcr_openshift-multus(216c3ac8-462c-49ec-87a2-c935d0c4ad25)\"" pod="openshift-multus/multus-gwhcr" podUID="216c3ac8-462c-49ec-87a2-c935d0c4ad25" Dec 03 12:16:37 crc kubenswrapper[4711]: I1203 12:16:37.389226 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-etcd/etcd-crc" podStartSLOduration=16.389204329000002 podStartE2EDuration="16.389204329s" podCreationTimestamp="2025-12-03 12:16:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:31.851563704 +0000 UTC m=+110.520815049" watchObservedRunningTime="2025-12-03 12:16:37.389204329 +0000 UTC m=+116.058455594" Dec 03 12:16:37 crc kubenswrapper[4711]: I1203 12:16:37.816507 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:16:37 crc kubenswrapper[4711]: E1203 12:16:37.816670 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:16:38 crc kubenswrapper[4711]: I1203 12:16:38.375253 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gwhcr_216c3ac8-462c-49ec-87a2-c935d0c4ad25/kube-multus/1.log" Dec 03 12:16:38 crc kubenswrapper[4711]: I1203 12:16:38.817016 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:16:38 crc kubenswrapper[4711]: I1203 12:16:38.817060 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:38 crc kubenswrapper[4711]: I1203 12:16:38.817058 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:16:38 crc kubenswrapper[4711]: E1203 12:16:38.817138 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:16:38 crc kubenswrapper[4711]: E1203 12:16:38.817321 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:16:38 crc kubenswrapper[4711]: E1203 12:16:38.817424 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:16:39 crc kubenswrapper[4711]: I1203 12:16:39.817214 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:16:39 crc kubenswrapper[4711]: E1203 12:16:39.817530 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:16:40 crc kubenswrapper[4711]: I1203 12:16:40.816487 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:16:40 crc kubenswrapper[4711]: I1203 12:16:40.816578 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:40 crc kubenswrapper[4711]: E1203 12:16:40.816672 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:16:40 crc kubenswrapper[4711]: E1203 12:16:40.816742 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:16:40 crc kubenswrapper[4711]: I1203 12:16:40.816826 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:16:40 crc kubenswrapper[4711]: E1203 12:16:40.816953 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:16:41 crc kubenswrapper[4711]: I1203 12:16:41.816572 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:16:41 crc kubenswrapper[4711]: E1203 12:16:41.818550 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:16:41 crc kubenswrapper[4711]: E1203 12:16:41.831167 4711 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 03 12:16:41 crc kubenswrapper[4711]: E1203 12:16:41.909655 4711 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 03 12:16:42 crc kubenswrapper[4711]: I1203 12:16:42.817168 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:42 crc kubenswrapper[4711]: I1203 12:16:42.817246 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:16:42 crc kubenswrapper[4711]: E1203 12:16:42.817326 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:16:42 crc kubenswrapper[4711]: I1203 12:16:42.817388 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:16:42 crc kubenswrapper[4711]: E1203 12:16:42.817625 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:16:42 crc kubenswrapper[4711]: E1203 12:16:42.817761 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:16:43 crc kubenswrapper[4711]: I1203 12:16:43.816309 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:16:43 crc kubenswrapper[4711]: E1203 12:16:43.816428 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:16:44 crc kubenswrapper[4711]: I1203 12:16:44.816252 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:44 crc kubenswrapper[4711]: I1203 12:16:44.816310 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:16:44 crc kubenswrapper[4711]: E1203 12:16:44.816466 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:16:44 crc kubenswrapper[4711]: I1203 12:16:44.816551 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:16:44 crc kubenswrapper[4711]: E1203 12:16:44.816700 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:16:44 crc kubenswrapper[4711]: E1203 12:16:44.816837 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:16:45 crc kubenswrapper[4711]: I1203 12:16:45.816982 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:16:45 crc kubenswrapper[4711]: E1203 12:16:45.817571 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:16:45 crc kubenswrapper[4711]: I1203 12:16:45.818075 4711 scope.go:117] "RemoveContainer" containerID="d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654" Dec 03 12:16:45 crc kubenswrapper[4711]: E1203 12:16:45.818360 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ct6xt_openshift-ovn-kubernetes(33d2332f-fdac-42be-891e-7eaef0e7ca9d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" Dec 03 12:16:46 crc kubenswrapper[4711]: I1203 12:16:46.816374 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:46 crc kubenswrapper[4711]: I1203 12:16:46.816428 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:16:46 crc kubenswrapper[4711]: E1203 12:16:46.816498 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:16:46 crc kubenswrapper[4711]: I1203 12:16:46.816448 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:16:46 crc kubenswrapper[4711]: E1203 12:16:46.816575 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:16:46 crc kubenswrapper[4711]: E1203 12:16:46.816676 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:16:46 crc kubenswrapper[4711]: E1203 12:16:46.911030 4711 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 12:16:47 crc kubenswrapper[4711]: I1203 12:16:47.816206 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:16:47 crc kubenswrapper[4711]: E1203 12:16:47.816336 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:16:48 crc kubenswrapper[4711]: I1203 12:16:48.816534 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:16:48 crc kubenswrapper[4711]: E1203 12:16:48.817301 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:16:48 crc kubenswrapper[4711]: I1203 12:16:48.816576 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:16:48 crc kubenswrapper[4711]: E1203 12:16:48.817546 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:16:48 crc kubenswrapper[4711]: I1203 12:16:48.816556 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:48 crc kubenswrapper[4711]: E1203 12:16:48.817808 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:16:48 crc kubenswrapper[4711]: I1203 12:16:48.817117 4711 scope.go:117] "RemoveContainer" containerID="0ad2a184bbabe1f39e87385729c4f5006623e99a5008fff374ff8754bba2f093" Dec 03 12:16:49 crc kubenswrapper[4711]: I1203 12:16:49.413256 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gwhcr_216c3ac8-462c-49ec-87a2-c935d0c4ad25/kube-multus/1.log" Dec 03 12:16:49 crc kubenswrapper[4711]: I1203 12:16:49.413574 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gwhcr" event={"ID":"216c3ac8-462c-49ec-87a2-c935d0c4ad25","Type":"ContainerStarted","Data":"1535e90b589a87c3e19fcbcae492e601aaffa008889481cd7a1ffad7b23a76ae"} Dec 03 12:16:49 crc kubenswrapper[4711]: I1203 12:16:49.817152 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:16:49 crc kubenswrapper[4711]: E1203 12:16:49.817412 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:16:50 crc kubenswrapper[4711]: I1203 12:16:50.817127 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:16:50 crc kubenswrapper[4711]: I1203 12:16:50.817216 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:50 crc kubenswrapper[4711]: I1203 12:16:50.817127 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:16:50 crc kubenswrapper[4711]: E1203 12:16:50.817314 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:16:50 crc kubenswrapper[4711]: E1203 12:16:50.817461 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:16:50 crc kubenswrapper[4711]: E1203 12:16:50.817535 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:16:51 crc kubenswrapper[4711]: I1203 12:16:51.816989 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:16:51 crc kubenswrapper[4711]: E1203 12:16:51.818278 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:16:51 crc kubenswrapper[4711]: E1203 12:16:51.911664 4711 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 12:16:52 crc kubenswrapper[4711]: I1203 12:16:52.816771 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:16:52 crc kubenswrapper[4711]: I1203 12:16:52.816808 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:16:52 crc kubenswrapper[4711]: I1203 12:16:52.816876 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:52 crc kubenswrapper[4711]: E1203 12:16:52.816941 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:16:52 crc kubenswrapper[4711]: E1203 12:16:52.817051 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:16:52 crc kubenswrapper[4711]: E1203 12:16:52.817235 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:16:53 crc kubenswrapper[4711]: I1203 12:16:53.816688 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:16:53 crc kubenswrapper[4711]: E1203 12:16:53.816900 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:16:54 crc kubenswrapper[4711]: I1203 12:16:54.816319 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:16:54 crc kubenswrapper[4711]: E1203 12:16:54.816494 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:16:54 crc kubenswrapper[4711]: I1203 12:16:54.816779 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:16:54 crc kubenswrapper[4711]: E1203 12:16:54.816865 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:16:54 crc kubenswrapper[4711]: I1203 12:16:54.817207 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:54 crc kubenswrapper[4711]: E1203 12:16:54.817401 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:16:55 crc kubenswrapper[4711]: I1203 12:16:55.816950 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:16:55 crc kubenswrapper[4711]: E1203 12:16:55.817089 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:16:56 crc kubenswrapper[4711]: I1203 12:16:56.817133 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:56 crc kubenswrapper[4711]: I1203 12:16:56.817188 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:16:56 crc kubenswrapper[4711]: E1203 12:16:56.817370 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:16:56 crc kubenswrapper[4711]: I1203 12:16:56.817390 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:16:56 crc kubenswrapper[4711]: E1203 12:16:56.817511 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:16:56 crc kubenswrapper[4711]: E1203 12:16:56.817550 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:16:56 crc kubenswrapper[4711]: I1203 12:16:56.818236 4711 scope.go:117] "RemoveContainer" containerID="d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654" Dec 03 12:16:56 crc kubenswrapper[4711]: E1203 12:16:56.913130 4711 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 12:16:57 crc kubenswrapper[4711]: I1203 12:16:57.816737 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:16:57 crc kubenswrapper[4711]: E1203 12:16:57.816974 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:16:58 crc kubenswrapper[4711]: I1203 12:16:58.129523 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wd9tz"] Dec 03 12:16:58 crc kubenswrapper[4711]: I1203 12:16:58.447253 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ct6xt_33d2332f-fdac-42be-891e-7eaef0e7ca9d/ovnkube-controller/3.log" Dec 03 12:16:58 crc kubenswrapper[4711]: I1203 12:16:58.451724 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" event={"ID":"33d2332f-fdac-42be-891e-7eaef0e7ca9d","Type":"ContainerStarted","Data":"320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce"} Dec 03 12:16:58 crc kubenswrapper[4711]: I1203 12:16:58.451754 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:16:58 crc kubenswrapper[4711]: E1203 12:16:58.452015 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:16:58 crc kubenswrapper[4711]: I1203 12:16:58.452592 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:16:58 crc kubenswrapper[4711]: I1203 12:16:58.816953 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:58 crc kubenswrapper[4711]: I1203 12:16:58.817063 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:16:58 crc kubenswrapper[4711]: E1203 12:16:58.817126 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:16:58 crc kubenswrapper[4711]: E1203 12:16:58.817166 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:16:58 crc kubenswrapper[4711]: I1203 12:16:58.817203 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:16:58 crc kubenswrapper[4711]: E1203 12:16:58.817283 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:17:00 crc kubenswrapper[4711]: I1203 12:17:00.816396 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:17:00 crc kubenswrapper[4711]: I1203 12:17:00.816472 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:17:00 crc kubenswrapper[4711]: I1203 12:17:00.816493 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:17:00 crc kubenswrapper[4711]: I1203 12:17:00.816504 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:17:00 crc kubenswrapper[4711]: E1203 12:17:00.817215 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd9tz" podUID="cdb7f01e-b2fd-49da-b7de-621da238d797" Dec 03 12:17:00 crc kubenswrapper[4711]: E1203 12:17:00.817429 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:17:00 crc kubenswrapper[4711]: E1203 12:17:00.817550 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:17:00 crc kubenswrapper[4711]: E1203 12:17:00.817656 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:17:02 crc kubenswrapper[4711]: I1203 12:17:02.816730 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:17:02 crc kubenswrapper[4711]: I1203 12:17:02.816832 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:17:02 crc kubenswrapper[4711]: I1203 12:17:02.816895 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:17:02 crc kubenswrapper[4711]: I1203 12:17:02.817034 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:17:02 crc kubenswrapper[4711]: I1203 12:17:02.823154 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 03 12:17:02 crc kubenswrapper[4711]: I1203 12:17:02.823215 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 03 12:17:02 crc kubenswrapper[4711]: I1203 12:17:02.825848 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 03 12:17:02 crc kubenswrapper[4711]: I1203 12:17:02.826098 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 03 12:17:02 crc kubenswrapper[4711]: I1203 12:17:02.828226 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 03 12:17:02 crc kubenswrapper[4711]: I1203 12:17:02.829374 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 03 12:17:05 crc kubenswrapper[4711]: I1203 12:17:05.402055 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:17:05 crc kubenswrapper[4711]: I1203 12:17:05.403980 4711 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 
12:17:07.762312 4711 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.805529 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" podStartSLOduration=125.805494386 podStartE2EDuration="2m5.805494386s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:58.501691318 +0000 UTC m=+137.170942573" watchObservedRunningTime="2025-12-03 12:17:07.805494386 +0000 UTC m=+146.474745681" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.806501 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gx6sm"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.808163 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.811793 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.826642 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.827088 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.827262 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.827291 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.827531 4711 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.828364 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.836634 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-hnrkf"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.837143 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dtc2k"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.837387 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5gvpv"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.837629 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5c4b6"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.837898 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bfrlg"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.838247 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.838538 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.838696 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-hnrkf" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.839642 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dtc2k" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.840219 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5gvpv" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.840969 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-5c4b6" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.841903 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7trmd"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.842577 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7trmd" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.843513 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-x6jjm"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.843887 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-x6jjm" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.844028 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.844784 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.852012 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r72zb"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.852740 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-wxvmz"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.853281 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxvmz" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.853598 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-zpvkj"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.853961 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r72zb" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.855893 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.859996 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.856274 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.856499 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.856604 4711 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"audit" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.856740 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.856859 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.860501 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.857743 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.857978 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.858049 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.858096 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.858140 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.858185 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.858191 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.858246 4711 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.858286 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.858294 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.858323 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.858328 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.861174 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.861212 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.858417 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.858501 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.858519 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.858562 4711 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.861510 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.858602 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.858647 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.858715 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.858727 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.858742 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.858788 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.858800 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.858824 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.858866 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.858928 4711 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"serving-cert" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.858969 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.858985 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.859390 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.859449 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.859503 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.859567 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.859613 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.859658 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.859704 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.859796 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 03 12:17:07 crc kubenswrapper[4711]: 
I1203 12:17:07.859860 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.859936 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.862055 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.863151 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.866075 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.877440 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-fkpkw"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.878017 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-59vns"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.878136 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-zpvkj" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.878473 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-fkpkw" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.878654 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dtc2k"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.878681 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gx6sm"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.878781 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-59vns" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.903068 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-hnrkf"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.903124 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nj2fj"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.914829 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.923300 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hm68r"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.923509 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nj2fj" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.928028 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.928275 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.929064 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.929076 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg5qx\" (UniqueName: \"kubernetes.io/projected/96380b31-f010-408a-b4b2-af721875ec8c-kube-api-access-gg5qx\") pod \"console-f9d7485db-fkpkw\" (UID: \"96380b31-f010-408a-b4b2-af721875ec8c\") " pod="openshift-console/console-f9d7485db-fkpkw" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.929144 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.929199 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.929799 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0e8a48c-f60d-494e-8928-061c3235d3f1-config\") pod \"route-controller-manager-6576b87f9c-7trmd\" (UID: \"a0e8a48c-f60d-494e-8928-061c3235d3f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7trmd" Dec 03 12:17:07 crc 
kubenswrapper[4711]: I1203 12:17:07.929895 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0e8a48c-f60d-494e-8928-061c3235d3f1-serving-cert\") pod \"route-controller-manager-6576b87f9c-7trmd\" (UID: \"a0e8a48c-f60d-494e-8928-061c3235d3f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7trmd" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.929978 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f49br\" (UniqueName: \"kubernetes.io/projected/cc6e7969-c654-4e5d-b29f-220f2eb0bb58-kube-api-access-f49br\") pod \"apiserver-7bbb656c7d-jp8jl\" (UID: \"cc6e7969-c654-4e5d-b29f-220f2eb0bb58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.930018 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cc6e7969-c654-4e5d-b29f-220f2eb0bb58-audit-dir\") pod \"apiserver-7bbb656c7d-jp8jl\" (UID: \"cc6e7969-c654-4e5d-b29f-220f2eb0bb58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.930060 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62513aeb-7bbf-4873-905a-0e52f11618ed-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-zpvkj\" (UID: \"62513aeb-7bbf-4873-905a-0e52f11618ed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zpvkj" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.930239 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/cc6e7969-c654-4e5d-b29f-220f2eb0bb58-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jp8jl\" (UID: \"cc6e7969-c654-4e5d-b29f-220f2eb0bb58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.930271 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94c142bf-dba2-4cc2-bf4d-13688d6033b0-config\") pod \"machine-approver-56656f9798-wxvmz\" (UID: \"94c142bf-dba2-4cc2-bf4d-13688d6033b0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxvmz" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.930320 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdlq8\" (UniqueName: \"kubernetes.io/projected/a0e8a48c-f60d-494e-8928-061c3235d3f1-kube-api-access-wdlq8\") pod \"route-controller-manager-6576b87f9c-7trmd\" (UID: \"a0e8a48c-f60d-494e-8928-061c3235d3f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7trmd" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.930346 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cc6e7969-c654-4e5d-b29f-220f2eb0bb58-encryption-config\") pod \"apiserver-7bbb656c7d-jp8jl\" (UID: \"cc6e7969-c654-4e5d-b29f-220f2eb0bb58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.930499 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cc6e7969-c654-4e5d-b29f-220f2eb0bb58-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jp8jl\" (UID: \"cc6e7969-c654-4e5d-b29f-220f2eb0bb58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" Dec 03 12:17:07 crc 
kubenswrapper[4711]: I1203 12:17:07.930544 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96380b31-f010-408a-b4b2-af721875ec8c-oauth-serving-cert\") pod \"console-f9d7485db-fkpkw\" (UID: \"96380b31-f010-408a-b4b2-af721875ec8c\") " pod="openshift-console/console-f9d7485db-fkpkw" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.930618 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62513aeb-7bbf-4873-905a-0e52f11618ed-config\") pod \"authentication-operator-69f744f599-zpvkj\" (UID: \"62513aeb-7bbf-4873-905a-0e52f11618ed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zpvkj" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.930719 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96380b31-f010-408a-b4b2-af721875ec8c-console-oauth-config\") pod \"console-f9d7485db-fkpkw\" (UID: \"96380b31-f010-408a-b4b2-af721875ec8c\") " pod="openshift-console/console-f9d7485db-fkpkw" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.930750 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a0e8a48c-f60d-494e-8928-061c3235d3f1-client-ca\") pod \"route-controller-manager-6576b87f9c-7trmd\" (UID: \"a0e8a48c-f60d-494e-8928-061c3235d3f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7trmd" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.930771 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc6e7969-c654-4e5d-b29f-220f2eb0bb58-serving-cert\") pod 
\"apiserver-7bbb656c7d-jp8jl\" (UID: \"cc6e7969-c654-4e5d-b29f-220f2eb0bb58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.930826 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/94c142bf-dba2-4cc2-bf4d-13688d6033b0-auth-proxy-config\") pod \"machine-approver-56656f9798-wxvmz\" (UID: \"94c142bf-dba2-4cc2-bf4d-13688d6033b0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxvmz" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.930854 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62513aeb-7bbf-4873-905a-0e52f11618ed-service-ca-bundle\") pod \"authentication-operator-69f744f599-zpvkj\" (UID: \"62513aeb-7bbf-4873-905a-0e52f11618ed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zpvkj" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.930880 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f67c1b42-653d-46be-8cf8-d68fe0541f3e-trusted-ca\") pod \"console-operator-58897d9998-5c4b6\" (UID: \"f67c1b42-653d-46be-8cf8-d68fe0541f3e\") " pod="openshift-console-operator/console-operator-58897d9998-5c4b6" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.930968 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96380b31-f010-408a-b4b2-af721875ec8c-service-ca\") pod \"console-f9d7485db-fkpkw\" (UID: \"96380b31-f010-408a-b4b2-af721875ec8c\") " pod="openshift-console/console-f9d7485db-fkpkw" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.930984 4711 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f67c1b42-653d-46be-8cf8-d68fe0541f3e-serving-cert\") pod \"console-operator-58897d9998-5c4b6\" (UID: \"f67c1b42-653d-46be-8cf8-d68fe0541f3e\") " pod="openshift-console-operator/console-operator-58897d9998-5c4b6" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.931000 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cc6e7969-c654-4e5d-b29f-220f2eb0bb58-audit-policies\") pod \"apiserver-7bbb656c7d-jp8jl\" (UID: \"cc6e7969-c654-4e5d-b29f-220f2eb0bb58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.931024 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96380b31-f010-408a-b4b2-af721875ec8c-console-serving-cert\") pod \"console-f9d7485db-fkpkw\" (UID: \"96380b31-f010-408a-b4b2-af721875ec8c\") " pod="openshift-console/console-f9d7485db-fkpkw" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.931040 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f274w\" (UniqueName: \"kubernetes.io/projected/f67c1b42-653d-46be-8cf8-d68fe0541f3e-kube-api-access-f274w\") pod \"console-operator-58897d9998-5c4b6\" (UID: \"f67c1b42-653d-46be-8cf8-d68fe0541f3e\") " pod="openshift-console-operator/console-operator-58897d9998-5c4b6" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.931062 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f67c1b42-653d-46be-8cf8-d68fe0541f3e-config\") pod \"console-operator-58897d9998-5c4b6\" (UID: \"f67c1b42-653d-46be-8cf8-d68fe0541f3e\") " 
pod="openshift-console-operator/console-operator-58897d9998-5c4b6" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.931079 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rm56\" (UniqueName: \"kubernetes.io/projected/94c142bf-dba2-4cc2-bf4d-13688d6033b0-kube-api-access-6rm56\") pod \"machine-approver-56656f9798-wxvmz\" (UID: \"94c142bf-dba2-4cc2-bf4d-13688d6033b0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxvmz" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.931103 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96380b31-f010-408a-b4b2-af721875ec8c-console-config\") pod \"console-f9d7485db-fkpkw\" (UID: \"96380b31-f010-408a-b4b2-af721875ec8c\") " pod="openshift-console/console-f9d7485db-fkpkw" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.931125 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96380b31-f010-408a-b4b2-af721875ec8c-trusted-ca-bundle\") pod \"console-f9d7485db-fkpkw\" (UID: \"96380b31-f010-408a-b4b2-af721875ec8c\") " pod="openshift-console/console-f9d7485db-fkpkw" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.931140 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62513aeb-7bbf-4873-905a-0e52f11618ed-serving-cert\") pod \"authentication-operator-69f744f599-zpvkj\" (UID: \"62513aeb-7bbf-4873-905a-0e52f11618ed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zpvkj" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.931168 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6fkb\" (UniqueName: 
\"kubernetes.io/projected/62513aeb-7bbf-4873-905a-0e52f11618ed-kube-api-access-f6fkb\") pod \"authentication-operator-69f744f599-zpvkj\" (UID: \"62513aeb-7bbf-4873-905a-0e52f11618ed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zpvkj" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.931190 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cc6e7969-c654-4e5d-b29f-220f2eb0bb58-etcd-client\") pod \"apiserver-7bbb656c7d-jp8jl\" (UID: \"cc6e7969-c654-4e5d-b29f-220f2eb0bb58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.931219 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/94c142bf-dba2-4cc2-bf4d-13688d6033b0-machine-approver-tls\") pod \"machine-approver-56656f9798-wxvmz\" (UID: \"94c142bf-dba2-4cc2-bf4d-13688d6033b0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxvmz" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.931701 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.931780 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.931974 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.932072 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.932178 4711 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.932272 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.932411 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.932442 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.932541 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.932713 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-45rj8"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.932787 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-hm68r" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.933549 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-xksnd"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.933831 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-45rj8" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.934828 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66k2k"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.934953 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xksnd" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.935671 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-c8g7k"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.935897 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66k2k" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.936255 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mjzpn"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.936546 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-c8g7k" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.941717 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-zzqqb"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.942214 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khs4n"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.942460 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dn4mb"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.942931 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dn4mb" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.943293 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mjzpn" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.943648 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zzqqb" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.944028 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khs4n" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.943163 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q47nn"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.948351 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.949655 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wll9"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.949996 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q47nn" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.950597 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wll9" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.955783 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.955993 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s24hc"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.956618 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.956654 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s24hc" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.958688 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.959546 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.961271 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.961681 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.961791 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sjc5j"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.962130 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.962282 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.962343 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.962528 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.962536 4711 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29412735-shrp8"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.962665 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.962293 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.963054 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.963243 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.963297 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.963361 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-shrp8" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.963589 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.963995 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.978524 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.980266 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.981732 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zhfzk"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.983942 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-56rtb"] Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.988048 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.989323 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zhfzk" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.990834 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.991162 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.991278 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.991395 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.991718 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.991934 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.992840 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.993431 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 03 12:17:07 crc kubenswrapper[4711]: I1203 12:17:07.996531 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.004399 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.004884 4711 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.005173 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jdq9h"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.005451 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-56rtb" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.005680 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5f942"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.006138 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jdq9h" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.006186 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmpjf"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.006521 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5f942" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.006742 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmpjf" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.007217 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.011278 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9zjpk"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.014429 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5c4b6"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.014766 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9zjpk" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.017112 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-nmlhj"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.018284 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nmlhj" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.025405 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7brcf"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.026088 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-ng58r"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.026462 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-ng58r" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.026725 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7brcf" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.027710 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-x6jjm"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.030945 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r72zb"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.030989 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7trmd"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.031621 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.031670 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96380b31-f010-408a-b4b2-af721875ec8c-trusted-ca-bundle\") pod \"console-f9d7485db-fkpkw\" (UID: \"96380b31-f010-408a-b4b2-af721875ec8c\") " pod="openshift-console/console-f9d7485db-fkpkw" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.031700 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62513aeb-7bbf-4873-905a-0e52f11618ed-serving-cert\") pod \"authentication-operator-69f744f599-zpvkj\" (UID: \"62513aeb-7bbf-4873-905a-0e52f11618ed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zpvkj" Dec 03 
12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.031726 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-node-pullsecrets\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.031726 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bfrlg"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.031750 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tkkz\" (UniqueName: \"kubernetes.io/projected/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-kube-api-access-6tkkz\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.031774 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.031797 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-audit-dir\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.031822 4711 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-f6fkb\" (UniqueName: \"kubernetes.io/projected/62513aeb-7bbf-4873-905a-0e52f11618ed-kube-api-access-f6fkb\") pod \"authentication-operator-69f744f599-zpvkj\" (UID: \"62513aeb-7bbf-4873-905a-0e52f11618ed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zpvkj" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.031843 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09fbd47f-ad50-4d45-8505-f73dd228dd58-serving-cert\") pod \"openshift-config-operator-7777fb866f-59vns\" (UID: \"09fbd47f-ad50-4d45-8505-f73dd228dd58\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-59vns" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.031865 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/064e5e0c-68f0-4d05-b054-17948a298623-config\") pod \"machine-api-operator-5694c8668f-hnrkf\" (UID: \"064e5e0c-68f0-4d05-b054-17948a298623\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hnrkf" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.031886 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe2d0a48-2982-4c9b-945b-940c24cda7ff-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-r72zb\" (UID: \"fe2d0a48-2982-4c9b-945b-940c24cda7ff\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r72zb" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.031931 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cc6e7969-c654-4e5d-b29f-220f2eb0bb58-etcd-client\") pod \"apiserver-7bbb656c7d-jp8jl\" (UID: \"cc6e7969-c654-4e5d-b29f-220f2eb0bb58\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.031954 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-encryption-config\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.031975 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9krlj\" (UniqueName: \"kubernetes.io/projected/25c69a31-2a54-4077-9b8c-c859d1d20849-kube-api-access-9krlj\") pod \"controller-manager-879f6c89f-dtc2k\" (UID: \"25c69a31-2a54-4077-9b8c-c859d1d20849\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtc2k" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.031997 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2f826dc-7227-4d0d-844c-4b84a5902d7b-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-5gvpv\" (UID: \"c2f826dc-7227-4d0d-844c-4b84a5902d7b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5gvpv" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032018 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2f826dc-7227-4d0d-844c-4b84a5902d7b-config\") pod \"openshift-apiserver-operator-796bbdcf4f-5gvpv\" (UID: \"c2f826dc-7227-4d0d-844c-4b84a5902d7b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5gvpv" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032064 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" 
(UniqueName: \"kubernetes.io/secret/94c142bf-dba2-4cc2-bf4d-13688d6033b0-machine-approver-tls\") pod \"machine-approver-56656f9798-wxvmz\" (UID: \"94c142bf-dba2-4cc2-bf4d-13688d6033b0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxvmz" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032087 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg5qx\" (UniqueName: \"kubernetes.io/projected/96380b31-f010-408a-b4b2-af721875ec8c-kube-api-access-gg5qx\") pod \"console-f9d7485db-fkpkw\" (UID: \"96380b31-f010-408a-b4b2-af721875ec8c\") " pod="openshift-console/console-f9d7485db-fkpkw" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032109 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25c69a31-2a54-4077-9b8c-c859d1d20849-client-ca\") pod \"controller-manager-879f6c89f-dtc2k\" (UID: \"25c69a31-2a54-4077-9b8c-c859d1d20849\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtc2k" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032136 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0e8a48c-f60d-494e-8928-061c3235d3f1-config\") pod \"route-controller-manager-6576b87f9c-7trmd\" (UID: \"a0e8a48c-f60d-494e-8928-061c3235d3f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7trmd" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032158 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mf9p\" (UniqueName: \"kubernetes.io/projected/09fbd47f-ad50-4d45-8505-f73dd228dd58-kube-api-access-5mf9p\") pod \"openshift-config-operator-7777fb866f-59vns\" (UID: \"09fbd47f-ad50-4d45-8505-f73dd228dd58\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-59vns" Dec 03 12:17:08 
crc kubenswrapper[4711]: I1203 12:17:08.032178 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4lcl\" (UniqueName: \"kubernetes.io/projected/c2f826dc-7227-4d0d-844c-4b84a5902d7b-kube-api-access-w4lcl\") pod \"openshift-apiserver-operator-796bbdcf4f-5gvpv\" (UID: \"c2f826dc-7227-4d0d-844c-4b84a5902d7b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5gvpv" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032204 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0e8a48c-f60d-494e-8928-061c3235d3f1-serving-cert\") pod \"route-controller-manager-6576b87f9c-7trmd\" (UID: \"a0e8a48c-f60d-494e-8928-061c3235d3f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7trmd" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032226 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f49br\" (UniqueName: \"kubernetes.io/projected/cc6e7969-c654-4e5d-b29f-220f2eb0bb58-kube-api-access-f49br\") pod \"apiserver-7bbb656c7d-jp8jl\" (UID: \"cc6e7969-c654-4e5d-b29f-220f2eb0bb58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032251 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cc6e7969-c654-4e5d-b29f-220f2eb0bb58-audit-dir\") pod \"apiserver-7bbb656c7d-jp8jl\" (UID: \"cc6e7969-c654-4e5d-b29f-220f2eb0bb58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032277 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032298 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-etcd-serving-ca\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032322 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25c69a31-2a54-4077-9b8c-c859d1d20849-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dtc2k\" (UID: \"25c69a31-2a54-4077-9b8c-c859d1d20849\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtc2k" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032344 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc6e7969-c654-4e5d-b29f-220f2eb0bb58-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jp8jl\" (UID: \"cc6e7969-c654-4e5d-b29f-220f2eb0bb58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032363 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94c142bf-dba2-4cc2-bf4d-13688d6033b0-config\") pod \"machine-approver-56656f9798-wxvmz\" (UID: \"94c142bf-dba2-4cc2-bf4d-13688d6033b0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxvmz" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032383 4711 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62513aeb-7bbf-4873-905a-0e52f11618ed-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-zpvkj\" (UID: \"62513aeb-7bbf-4873-905a-0e52f11618ed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zpvkj" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032404 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-audit-policies\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032426 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdlq8\" (UniqueName: \"kubernetes.io/projected/a0e8a48c-f60d-494e-8928-061c3235d3f1-kube-api-access-wdlq8\") pod \"route-controller-manager-6576b87f9c-7trmd\" (UID: \"a0e8a48c-f60d-494e-8928-061c3235d3f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7trmd" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032446 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cc6e7969-c654-4e5d-b29f-220f2eb0bb58-encryption-config\") pod \"apiserver-7bbb656c7d-jp8jl\" (UID: \"cc6e7969-c654-4e5d-b29f-220f2eb0bb58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032467 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/09fbd47f-ad50-4d45-8505-f73dd228dd58-available-featuregates\") pod 
\"openshift-config-operator-7777fb866f-59vns\" (UID: \"09fbd47f-ad50-4d45-8505-f73dd228dd58\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-59vns" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032489 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-audit-dir\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032511 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e9a71eaa-37ba-4311-8f1f-30b4580cb030-proxy-tls\") pod \"machine-config-controller-84d6567774-nj2fj\" (UID: \"e9a71eaa-37ba-4311-8f1f-30b4580cb030\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nj2fj" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032534 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cc6e7969-c654-4e5d-b29f-220f2eb0bb58-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jp8jl\" (UID: \"cc6e7969-c654-4e5d-b29f-220f2eb0bb58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032558 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032591 4711 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96380b31-f010-408a-b4b2-af721875ec8c-oauth-serving-cert\") pod \"console-f9d7485db-fkpkw\" (UID: \"96380b31-f010-408a-b4b2-af721875ec8c\") " pod="openshift-console/console-f9d7485db-fkpkw" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032613 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62513aeb-7bbf-4873-905a-0e52f11618ed-config\") pod \"authentication-operator-69f744f599-zpvkj\" (UID: \"62513aeb-7bbf-4873-905a-0e52f11618ed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zpvkj" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032637 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/064e5e0c-68f0-4d05-b054-17948a298623-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-hnrkf\" (UID: \"064e5e0c-68f0-4d05-b054-17948a298623\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hnrkf" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032661 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032684 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq9jv\" (UniqueName: \"kubernetes.io/projected/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-kube-api-access-vq9jv\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: 
\"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032707 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032728 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-etcd-client\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032749 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a0e8a48c-f60d-494e-8928-061c3235d3f1-client-ca\") pod \"route-controller-manager-6576b87f9c-7trmd\" (UID: \"a0e8a48c-f60d-494e-8928-061c3235d3f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7trmd" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032771 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96380b31-f010-408a-b4b2-af721875ec8c-console-oauth-config\") pod \"console-f9d7485db-fkpkw\" (UID: \"96380b31-f010-408a-b4b2-af721875ec8c\") " pod="openshift-console/console-f9d7485db-fkpkw" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032794 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" 
(UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032817 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc6e7969-c654-4e5d-b29f-220f2eb0bb58-serving-cert\") pod \"apiserver-7bbb656c7d-jp8jl\" (UID: \"cc6e7969-c654-4e5d-b29f-220f2eb0bb58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032839 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cxhv\" (UniqueName: \"kubernetes.io/projected/fe2d0a48-2982-4c9b-945b-940c24cda7ff-kube-api-access-9cxhv\") pod \"cluster-samples-operator-665b6dd947-r72zb\" (UID: \"fe2d0a48-2982-4c9b-945b-940c24cda7ff\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r72zb" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032860 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-image-import-ca\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032883 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/94c142bf-dba2-4cc2-bf4d-13688d6033b0-auth-proxy-config\") pod \"machine-approver-56656f9798-wxvmz\" (UID: \"94c142bf-dba2-4cc2-bf4d-13688d6033b0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxvmz" Dec 03 12:17:08 crc 
kubenswrapper[4711]: I1203 12:17:08.032909 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-296hh\" (UniqueName: \"kubernetes.io/projected/e9a71eaa-37ba-4311-8f1f-30b4580cb030-kube-api-access-296hh\") pod \"machine-config-controller-84d6567774-nj2fj\" (UID: \"e9a71eaa-37ba-4311-8f1f-30b4580cb030\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nj2fj" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032967 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62513aeb-7bbf-4873-905a-0e52f11618ed-service-ca-bundle\") pod \"authentication-operator-69f744f599-zpvkj\" (UID: \"62513aeb-7bbf-4873-905a-0e52f11618ed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zpvkj" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.032991 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.033013 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-serving-cert\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.033046 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/f67c1b42-653d-46be-8cf8-d68fe0541f3e-trusted-ca\") pod \"console-operator-58897d9998-5c4b6\" (UID: \"f67c1b42-653d-46be-8cf8-d68fe0541f3e\") " pod="openshift-console-operator/console-operator-58897d9998-5c4b6" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.033066 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/064e5e0c-68f0-4d05-b054-17948a298623-images\") pod \"machine-api-operator-5694c8668f-hnrkf\" (UID: \"064e5e0c-68f0-4d05-b054-17948a298623\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hnrkf" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.033087 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e9a71eaa-37ba-4311-8f1f-30b4580cb030-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nj2fj\" (UID: \"e9a71eaa-37ba-4311-8f1f-30b4580cb030\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nj2fj" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.033108 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-audit\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.033128 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 
12:17:08.033159 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96380b31-f010-408a-b4b2-af721875ec8c-service-ca\") pod \"console-f9d7485db-fkpkw\" (UID: \"96380b31-f010-408a-b4b2-af721875ec8c\") " pod="openshift-console/console-f9d7485db-fkpkw" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.033180 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f67c1b42-653d-46be-8cf8-d68fe0541f3e-serving-cert\") pod \"console-operator-58897d9998-5c4b6\" (UID: \"f67c1b42-653d-46be-8cf8-d68fe0541f3e\") " pod="openshift-console-operator/console-operator-58897d9998-5c4b6" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.033201 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cc6e7969-c654-4e5d-b29f-220f2eb0bb58-audit-policies\") pod \"apiserver-7bbb656c7d-jp8jl\" (UID: \"cc6e7969-c654-4e5d-b29f-220f2eb0bb58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.033224 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhwsj\" (UniqueName: \"kubernetes.io/projected/064e5e0c-68f0-4d05-b054-17948a298623-kube-api-access-vhwsj\") pod \"machine-api-operator-5694c8668f-hnrkf\" (UID: \"064e5e0c-68f0-4d05-b054-17948a298623\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hnrkf" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.033247 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.033267 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-config\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.033286 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25c69a31-2a54-4077-9b8c-c859d1d20849-config\") pod \"controller-manager-879f6c89f-dtc2k\" (UID: \"25c69a31-2a54-4077-9b8c-c859d1d20849\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtc2k" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.033309 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96380b31-f010-408a-b4b2-af721875ec8c-console-serving-cert\") pod \"console-f9d7485db-fkpkw\" (UID: \"96380b31-f010-408a-b4b2-af721875ec8c\") " pod="openshift-console/console-f9d7485db-fkpkw" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.033332 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f274w\" (UniqueName: \"kubernetes.io/projected/f67c1b42-653d-46be-8cf8-d68fe0541f3e-kube-api-access-f274w\") pod \"console-operator-58897d9998-5c4b6\" (UID: \"f67c1b42-653d-46be-8cf8-d68fe0541f3e\") " pod="openshift-console-operator/console-operator-58897d9998-5c4b6" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.033352 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.033375 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f67c1b42-653d-46be-8cf8-d68fe0541f3e-config\") pod \"console-operator-58897d9998-5c4b6\" (UID: \"f67c1b42-653d-46be-8cf8-d68fe0541f3e\") " pod="openshift-console-operator/console-operator-58897d9998-5c4b6" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.033400 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rm56\" (UniqueName: \"kubernetes.io/projected/94c142bf-dba2-4cc2-bf4d-13688d6033b0-kube-api-access-6rm56\") pod \"machine-approver-56656f9798-wxvmz\" (UID: \"94c142bf-dba2-4cc2-bf4d-13688d6033b0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxvmz" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.033420 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdgrv\" (UniqueName: \"kubernetes.io/projected/8cc669b1-f54c-476c-ae6c-81f5bf982b7e-kube-api-access-wdgrv\") pod \"downloads-7954f5f757-x6jjm\" (UID: \"8cc669b1-f54c-476c-ae6c-81f5bf982b7e\") " pod="openshift-console/downloads-7954f5f757-x6jjm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.033441 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96380b31-f010-408a-b4b2-af721875ec8c-console-config\") pod \"console-f9d7485db-fkpkw\" (UID: \"96380b31-f010-408a-b4b2-af721875ec8c\") " pod="openshift-console/console-f9d7485db-fkpkw" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.033463 4711 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.033482 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25c69a31-2a54-4077-9b8c-c859d1d20849-serving-cert\") pod \"controller-manager-879f6c89f-dtc2k\" (UID: \"25c69a31-2a54-4077-9b8c-c859d1d20849\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtc2k" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.033980 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96380b31-f010-408a-b4b2-af721875ec8c-trusted-ca-bundle\") pod \"console-f9d7485db-fkpkw\" (UID: \"96380b31-f010-408a-b4b2-af721875ec8c\") " pod="openshift-console/console-f9d7485db-fkpkw" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.034847 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96380b31-f010-408a-b4b2-af721875ec8c-oauth-serving-cert\") pod \"console-f9d7485db-fkpkw\" (UID: \"96380b31-f010-408a-b4b2-af721875ec8c\") " pod="openshift-console/console-f9d7485db-fkpkw" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.038705 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5gvpv"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.038765 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-59vns"] Dec 03 12:17:08 crc 
kubenswrapper[4711]: I1203 12:17:08.039292 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96380b31-f010-408a-b4b2-af721875ec8c-console-config\") pod \"console-f9d7485db-fkpkw\" (UID: \"96380b31-f010-408a-b4b2-af721875ec8c\") " pod="openshift-console/console-f9d7485db-fkpkw" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.039774 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96380b31-f010-408a-b4b2-af721875ec8c-service-ca\") pod \"console-f9d7485db-fkpkw\" (UID: \"96380b31-f010-408a-b4b2-af721875ec8c\") " pod="openshift-console/console-f9d7485db-fkpkw" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.040698 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f67c1b42-653d-46be-8cf8-d68fe0541f3e-config\") pod \"console-operator-58897d9998-5c4b6\" (UID: \"f67c1b42-653d-46be-8cf8-d68fe0541f3e\") " pod="openshift-console-operator/console-operator-58897d9998-5c4b6" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.040789 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0e8a48c-f60d-494e-8928-061c3235d3f1-config\") pod \"route-controller-manager-6576b87f9c-7trmd\" (UID: \"a0e8a48c-f60d-494e-8928-061c3235d3f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7trmd" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.041476 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/94c142bf-dba2-4cc2-bf4d-13688d6033b0-auth-proxy-config\") pod \"machine-approver-56656f9798-wxvmz\" (UID: \"94c142bf-dba2-4cc2-bf4d-13688d6033b0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxvmz" Dec 03 12:17:08 crc 
kubenswrapper[4711]: I1203 12:17:08.041509 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62513aeb-7bbf-4873-905a-0e52f11618ed-service-ca-bundle\") pod \"authentication-operator-69f744f599-zpvkj\" (UID: \"62513aeb-7bbf-4873-905a-0e52f11618ed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zpvkj" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.041685 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62513aeb-7bbf-4873-905a-0e52f11618ed-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-zpvkj\" (UID: \"62513aeb-7bbf-4873-905a-0e52f11618ed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zpvkj" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.041730 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cc6e7969-c654-4e5d-b29f-220f2eb0bb58-audit-policies\") pod \"apiserver-7bbb656c7d-jp8jl\" (UID: \"cc6e7969-c654-4e5d-b29f-220f2eb0bb58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.041827 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.042234 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f67c1b42-653d-46be-8cf8-d68fe0541f3e-trusted-ca\") pod \"console-operator-58897d9998-5c4b6\" (UID: \"f67c1b42-653d-46be-8cf8-d68fe0541f3e\") " pod="openshift-console-operator/console-operator-58897d9998-5c4b6" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.042640 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/cc6e7969-c654-4e5d-b29f-220f2eb0bb58-audit-dir\") pod \"apiserver-7bbb656c7d-jp8jl\" (UID: \"cc6e7969-c654-4e5d-b29f-220f2eb0bb58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.042641 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cc6e7969-c654-4e5d-b29f-220f2eb0bb58-etcd-client\") pod \"apiserver-7bbb656c7d-jp8jl\" (UID: \"cc6e7969-c654-4e5d-b29f-220f2eb0bb58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.043290 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62513aeb-7bbf-4873-905a-0e52f11618ed-config\") pod \"authentication-operator-69f744f599-zpvkj\" (UID: \"62513aeb-7bbf-4873-905a-0e52f11618ed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zpvkj" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.043420 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a0e8a48c-f60d-494e-8928-061c3235d3f1-client-ca\") pod \"route-controller-manager-6576b87f9c-7trmd\" (UID: \"a0e8a48c-f60d-494e-8928-061c3235d3f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7trmd" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.044145 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62513aeb-7bbf-4873-905a-0e52f11618ed-serving-cert\") pod \"authentication-operator-69f744f599-zpvkj\" (UID: \"62513aeb-7bbf-4873-905a-0e52f11618ed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zpvkj" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.044206 4711 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cc6e7969-c654-4e5d-b29f-220f2eb0bb58-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jp8jl\" (UID: \"cc6e7969-c654-4e5d-b29f-220f2eb0bb58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.044150 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96380b31-f010-408a-b4b2-af721875ec8c-console-serving-cert\") pod \"console-f9d7485db-fkpkw\" (UID: \"96380b31-f010-408a-b4b2-af721875ec8c\") " pod="openshift-console/console-f9d7485db-fkpkw" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.044994 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc6e7969-c654-4e5d-b29f-220f2eb0bb58-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jp8jl\" (UID: \"cc6e7969-c654-4e5d-b29f-220f2eb0bb58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.047088 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cc6e7969-c654-4e5d-b29f-220f2eb0bb58-encryption-config\") pod \"apiserver-7bbb656c7d-jp8jl\" (UID: \"cc6e7969-c654-4e5d-b29f-220f2eb0bb58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.047353 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/94c142bf-dba2-4cc2-bf4d-13688d6033b0-machine-approver-tls\") pod \"machine-approver-56656f9798-wxvmz\" (UID: \"94c142bf-dba2-4cc2-bf4d-13688d6033b0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxvmz" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.047756 4711 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f67c1b42-653d-46be-8cf8-d68fe0541f3e-serving-cert\") pod \"console-operator-58897d9998-5c4b6\" (UID: \"f67c1b42-653d-46be-8cf8-d68fe0541f3e\") " pod="openshift-console-operator/console-operator-58897d9998-5c4b6" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.048518 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0e8a48c-f60d-494e-8928-061c3235d3f1-serving-cert\") pod \"route-controller-manager-6576b87f9c-7trmd\" (UID: \"a0e8a48c-f60d-494e-8928-061c3235d3f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7trmd" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.055047 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94c142bf-dba2-4cc2-bf4d-13688d6033b0-config\") pod \"machine-approver-56656f9798-wxvmz\" (UID: \"94c142bf-dba2-4cc2-bf4d-13688d6033b0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxvmz" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.058633 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96380b31-f010-408a-b4b2-af721875ec8c-console-oauth-config\") pod \"console-f9d7485db-fkpkw\" (UID: \"96380b31-f010-408a-b4b2-af721875ec8c\") " pod="openshift-console/console-f9d7485db-fkpkw" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.058748 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-fh86s"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.059813 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-fkpkw"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.059957 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-fh86s" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.060307 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.060556 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-xksnd"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.063364 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66k2k"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.065640 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hm68r"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.067656 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc6e7969-c654-4e5d-b29f-220f2eb0bb58-serving-cert\") pod \"apiserver-7bbb656c7d-jp8jl\" (UID: \"cc6e7969-c654-4e5d-b29f-220f2eb0bb58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.068012 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nj2fj"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.070354 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-zpvkj"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.070802 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.071903 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khs4n"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.073350 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-zzqqb"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.074719 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sjc5j"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.076146 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412735-shrp8"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.077563 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.079017 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mjzpn"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.081110 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dn4mb"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.082551 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9zjpk"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.084030 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-45rj8"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.085549 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-c8g7k"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.087470 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s24hc"] Dec 03 12:17:08 crc 
kubenswrapper[4711]: I1203 12:17:08.088863 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fh86s"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.098450 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.101784 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-9zqgp"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.103888 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9zqgp" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.105493 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-hxjp8"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.107372 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hxjp8" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.108041 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7brcf"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.111785 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.112526 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wll9"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.114953 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q47nn"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.116227 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-nmlhj"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.117495 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hxjp8"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.120626 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5f942"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.122149 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmpjf"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.123826 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-56rtb"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.125396 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zhfzk"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.127452 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jdq9h"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.129234 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bs88v"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.130532 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-bs88v" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.130801 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.132270 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bs88v"] Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.134394 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-audit-policies\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.134423 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-etcd-serving-ca\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.134439 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-audit-dir\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.134457 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e9a71eaa-37ba-4311-8f1f-30b4580cb030-proxy-tls\") pod \"machine-config-controller-84d6567774-nj2fj\" (UID: \"e9a71eaa-37ba-4311-8f1f-30b4580cb030\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nj2fj" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.134476 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ec52f86-45ff-4a9d-8913-a8848e0589e7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-66k2k\" (UID: \"9ec52f86-45ff-4a9d-8913-a8848e0589e7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66k2k" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.134496 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/09fbd47f-ad50-4d45-8505-f73dd228dd58-available-featuregates\") pod \"openshift-config-operator-7777fb866f-59vns\" (UID: \"09fbd47f-ad50-4d45-8505-f73dd228dd58\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-59vns" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.134515 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9e2da1c-8e56-47d1-930e-6afacb8839e1-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-6wll9\" (UID: 
\"b9e2da1c-8e56-47d1-930e-6afacb8839e1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wll9" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.134539 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.134557 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq9jv\" (UniqueName: \"kubernetes.io/projected/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-kube-api-access-vq9jv\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.134576 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/064e5e0c-68f0-4d05-b054-17948a298623-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-hnrkf\" (UID: \"064e5e0c-68f0-4d05-b054-17948a298623\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hnrkf" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.134595 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9cdc9430-275b-4891-858d-a68d9148058e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-45rj8\" (UID: \"9cdc9430-275b-4891-858d-a68d9148058e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-45rj8" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.134609 4711 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01c32d8f-3d2d-4bcf-bb04-620569d59ff5-serving-cert\") pod \"service-ca-operator-777779d784-dn4mb\" (UID: \"01c32d8f-3d2d-4bcf-bb04-620569d59ff5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dn4mb" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.134624 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9bc96e8b-e5df-40a7-8690-530895999a16-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-c8g7k\" (UID: \"9bc96e8b-e5df-40a7-8690-530895999a16\") " pod="openshift-marketplace/marketplace-operator-79b997595-c8g7k" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.134646 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b25215d-cee5-46c3-8008-d9165ac32b96-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-56rtb\" (UID: \"8b25215d-cee5-46c3-8008-d9165ac32b96\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-56rtb" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.134664 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e00519d7-eaff-4176-a850-486dbb03e0cc-profile-collector-cert\") pod \"catalog-operator-68c6474976-khs4n\" (UID: \"e00519d7-eaff-4176-a850-486dbb03e0cc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khs4n" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.134679 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq7rh\" (UniqueName: 
\"kubernetes.io/projected/e00519d7-eaff-4176-a850-486dbb03e0cc-kube-api-access-kq7rh\") pod \"catalog-operator-68c6474976-khs4n\" (UID: \"e00519d7-eaff-4176-a850-486dbb03e0cc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khs4n" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.134695 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23-stats-auth\") pod \"router-default-5444994796-ng58r\" (UID: \"efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23\") " pod="openshift-ingress/router-default-5444994796-ng58r" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.134712 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3095a2e-f5c5-436e-953a-1fb6ae1950bb-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-zhfzk\" (UID: \"f3095a2e-f5c5-436e-953a-1fb6ae1950bb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zhfzk" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.134728 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e00519d7-eaff-4176-a850-486dbb03e0cc-srv-cert\") pod \"catalog-operator-68c6474976-khs4n\" (UID: \"e00519d7-eaff-4176-a850-486dbb03e0cc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khs4n" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.134744 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94tcf\" (UniqueName: \"kubernetes.io/projected/9cdc9430-275b-4891-858d-a68d9148058e-kube-api-access-94tcf\") pod \"multus-admission-controller-857f4d67dd-45rj8\" (UID: \"9cdc9430-275b-4891-858d-a68d9148058e\") 
" pod="openshift-multus/multus-admission-controller-857f4d67dd-45rj8" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.134758 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhxh8\" (UniqueName: \"kubernetes.io/projected/9bc96e8b-e5df-40a7-8690-530895999a16-kube-api-access-jhxh8\") pod \"marketplace-operator-79b997595-c8g7k\" (UID: \"9bc96e8b-e5df-40a7-8690-530895999a16\") " pod="openshift-marketplace/marketplace-operator-79b997595-c8g7k" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.134773 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04039569-bc9d-4bbe-972a-235add83f1b8-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-fmpjf\" (UID: \"04039569-bc9d-4bbe-972a-235add83f1b8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmpjf" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.134797 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.134814 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwg5t\" (UniqueName: \"kubernetes.io/projected/9ec52f86-45ff-4a9d-8913-a8848e0589e7-kube-api-access-gwg5t\") pod \"package-server-manager-789f6589d5-66k2k\" (UID: \"9ec52f86-45ff-4a9d-8913-a8848e0589e7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66k2k" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.134829 4711 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/10e059c5-9cf9-46f7-b31a-b97aae40f17a-metrics-tls\") pod \"dns-operator-744455d44c-mjzpn\" (UID: \"10e059c5-9cf9-46f7-b31a-b97aae40f17a\") " pod="openshift-dns-operator/dns-operator-744455d44c-mjzpn" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.134847 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7730e767-9d1f-4ac7-867a-08760f9d1990-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-s24hc\" (UID: \"7730e767-9d1f-4ac7-867a-08760f9d1990\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s24hc" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.134866 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/42bb9e00-bc53-46ba-96d6-e38a92694817-auth-proxy-config\") pod \"machine-config-operator-74547568cd-xksnd\" (UID: \"42bb9e00-bc53-46ba-96d6-e38a92694817\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xksnd" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.134889 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b9e2da1c-8e56-47d1-930e-6afacb8839e1-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-6wll9\" (UID: \"b9e2da1c-8e56-47d1-930e-6afacb8839e1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wll9" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.134910 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhwsj\" (UniqueName: \"kubernetes.io/projected/064e5e0c-68f0-4d05-b054-17948a298623-kube-api-access-vhwsj\") 
pod \"machine-api-operator-5694c8668f-hnrkf\" (UID: \"064e5e0c-68f0-4d05-b054-17948a298623\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hnrkf" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.134956 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25c69a31-2a54-4077-9b8c-c859d1d20849-config\") pod \"controller-manager-879f6c89f-dtc2k\" (UID: \"25c69a31-2a54-4077-9b8c-c859d1d20849\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtc2k" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.134974 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab93fa35-7435-4886-bcd7-a13ca47bd8fd-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7brcf\" (UID: \"ab93fa35-7435-4886-bcd7-a13ca47bd8fd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7brcf" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135002 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135018 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6jxb\" (UniqueName: \"kubernetes.io/projected/b9e2da1c-8e56-47d1-930e-6afacb8839e1-kube-api-access-p6jxb\") pod \"cluster-image-registry-operator-dc59b4c8b-6wll9\" (UID: \"b9e2da1c-8e56-47d1-930e-6afacb8839e1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wll9" Dec 03 12:17:08 crc 
kubenswrapper[4711]: I1203 12:17:08.135034 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5m9v\" (UniqueName: \"kubernetes.io/projected/f3095a2e-f5c5-436e-953a-1fb6ae1950bb-kube-api-access-h5m9v\") pod \"control-plane-machine-set-operator-78cbb6b69f-zhfzk\" (UID: \"f3095a2e-f5c5-436e-953a-1fb6ae1950bb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zhfzk" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135052 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sk2j\" (UniqueName: \"kubernetes.io/projected/01c32d8f-3d2d-4bcf-bb04-620569d59ff5-kube-api-access-2sk2j\") pod \"service-ca-operator-777779d784-dn4mb\" (UID: \"01c32d8f-3d2d-4bcf-bb04-620569d59ff5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dn4mb" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135069 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23-service-ca-bundle\") pod \"router-default-5444994796-ng58r\" (UID: \"efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23\") " pod="openshift-ingress/router-default-5444994796-ng58r" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135118 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/14ef9276-a825-478b-bee0-674657696c8e-srv-cert\") pod \"olm-operator-6b444d44fb-q47nn\" (UID: \"14ef9276-a825-478b-bee0-674657696c8e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q47nn" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135139 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/25c69a31-2a54-4077-9b8c-c859d1d20849-serving-cert\") pod \"controller-manager-879f6c89f-dtc2k\" (UID: \"25c69a31-2a54-4077-9b8c-c859d1d20849\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtc2k" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135158 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kqs9\" (UniqueName: \"kubernetes.io/projected/efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23-kube-api-access-6kqs9\") pod \"router-default-5444994796-ng58r\" (UID: \"efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23\") " pod="openshift-ingress/router-default-5444994796-ng58r" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135175 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tkkz\" (UniqueName: \"kubernetes.io/projected/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-kube-api-access-6tkkz\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135194 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh9h2\" (UniqueName: \"kubernetes.io/projected/09fb9b94-8286-4064-a6f5-15362c4f300d-kube-api-access-zh9h2\") pod \"service-ca-9c57cc56f-5f942\" (UID: \"09fb9b94-8286-4064-a6f5-15362c4f300d\") " pod="openshift-service-ca/service-ca-9c57cc56f-5f942" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135210 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/1cdfc9ef-2385-43ab-ad70-912c119522c7-tmpfs\") pod \"packageserver-d55dfcdfc-9zjpk\" (UID: \"1cdfc9ef-2385-43ab-ad70-912c119522c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9zjpk" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135226 4711 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09fbd47f-ad50-4d45-8505-f73dd228dd58-serving-cert\") pod \"openshift-config-operator-7777fb866f-59vns\" (UID: \"09fbd47f-ad50-4d45-8505-f73dd228dd58\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-59vns" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135224 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-etcd-serving-ca\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135243 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/064e5e0c-68f0-4d05-b054-17948a298623-config\") pod \"machine-api-operator-5694c8668f-hnrkf\" (UID: \"064e5e0c-68f0-4d05-b054-17948a298623\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hnrkf" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135260 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-audit-dir\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135276 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab93fa35-7435-4886-bcd7-a13ca47bd8fd-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7brcf\" (UID: \"ab93fa35-7435-4886-bcd7-a13ca47bd8fd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7brcf" Dec 
03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135293 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-encryption-config\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135309 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe2d0a48-2982-4c9b-945b-940c24cda7ff-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-r72zb\" (UID: \"fe2d0a48-2982-4c9b-945b-940c24cda7ff\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r72zb" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135326 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blvj9\" (UniqueName: \"kubernetes.io/projected/1cdfc9ef-2385-43ab-ad70-912c119522c7-kube-api-access-blvj9\") pod \"packageserver-d55dfcdfc-9zjpk\" (UID: \"1cdfc9ef-2385-43ab-ad70-912c119522c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9zjpk" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135343 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ba64eaf-060d-417e-8831-f67b27012082-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jdq9h\" (UID: \"6ba64eaf-060d-417e-8831-f67b27012082\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jdq9h" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135362 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9krlj\" (UniqueName: 
\"kubernetes.io/projected/25c69a31-2a54-4077-9b8c-c859d1d20849-kube-api-access-9krlj\") pod \"controller-manager-879f6c89f-dtc2k\" (UID: \"25c69a31-2a54-4077-9b8c-c859d1d20849\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtc2k" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135377 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2f826dc-7227-4d0d-844c-4b84a5902d7b-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-5gvpv\" (UID: \"c2f826dc-7227-4d0d-844c-4b84a5902d7b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5gvpv" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135395 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2f826dc-7227-4d0d-844c-4b84a5902d7b-config\") pod \"openshift-apiserver-operator-796bbdcf4f-5gvpv\" (UID: \"c2f826dc-7227-4d0d-844c-4b84a5902d7b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5gvpv" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135412 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9bc96e8b-e5df-40a7-8690-530895999a16-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-c8g7k\" (UID: \"9bc96e8b-e5df-40a7-8690-530895999a16\") " pod="openshift-marketplace/marketplace-operator-79b997595-c8g7k" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135444 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04039569-bc9d-4bbe-972a-235add83f1b8-config\") pod \"kube-apiserver-operator-766d6c64bb-fmpjf\" (UID: \"04039569-bc9d-4bbe-972a-235add83f1b8\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmpjf" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135467 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mktx6\" (UniqueName: \"kubernetes.io/projected/6ba64eaf-060d-417e-8831-f67b27012082-kube-api-access-mktx6\") pod \"openshift-controller-manager-operator-756b6f6bc6-jdq9h\" (UID: \"6ba64eaf-060d-417e-8831-f67b27012082\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jdq9h" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135485 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-829fm\" (UniqueName: \"kubernetes.io/projected/10e059c5-9cf9-46f7-b31a-b97aae40f17a-kube-api-access-829fm\") pod \"dns-operator-744455d44c-mjzpn\" (UID: \"10e059c5-9cf9-46f7-b31a-b97aae40f17a\") " pod="openshift-dns-operator/dns-operator-744455d44c-mjzpn" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135503 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mf9p\" (UniqueName: \"kubernetes.io/projected/09fbd47f-ad50-4d45-8505-f73dd228dd58-kube-api-access-5mf9p\") pod \"openshift-config-operator-7777fb866f-59vns\" (UID: \"09fbd47f-ad50-4d45-8505-f73dd228dd58\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-59vns" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135522 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4lcl\" (UniqueName: \"kubernetes.io/projected/c2f826dc-7227-4d0d-844c-4b84a5902d7b-kube-api-access-w4lcl\") pod \"openshift-apiserver-operator-796bbdcf4f-5gvpv\" (UID: \"c2f826dc-7227-4d0d-844c-4b84a5902d7b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5gvpv" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135539 4711 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjnsj\" (UniqueName: \"kubernetes.io/projected/a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95-kube-api-access-gjnsj\") pod \"collect-profiles-29412735-shrp8\" (UID: \"a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-shrp8" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135556 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23-default-certificate\") pod \"router-default-5444994796-ng58r\" (UID: \"efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23\") " pod="openshift-ingress/router-default-5444994796-ng58r" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135573 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25c69a31-2a54-4077-9b8c-c859d1d20849-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dtc2k\" (UID: \"25c69a31-2a54-4077-9b8c-c859d1d20849\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtc2k" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135597 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135611 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95-config-volume\") pod \"collect-profiles-29412735-shrp8\" 
(UID: \"a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-shrp8" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135629 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8fcw\" (UniqueName: \"kubernetes.io/projected/14ef9276-a825-478b-bee0-674657696c8e-kube-api-access-k8fcw\") pod \"olm-operator-6b444d44fb-q47nn\" (UID: \"14ef9276-a825-478b-bee0-674657696c8e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q47nn" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135647 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135669 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m4m7\" (UniqueName: \"kubernetes.io/projected/2975f7c8-1440-4eac-bccb-8746f327f768-kube-api-access-5m4m7\") pod \"migrator-59844c95c7-zzqqb\" (UID: \"2975f7c8-1440-4eac-bccb-8746f327f768\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zzqqb" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135688 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/42bb9e00-bc53-46ba-96d6-e38a92694817-images\") pod \"machine-config-operator-74547568cd-xksnd\" (UID: \"42bb9e00-bc53-46ba-96d6-e38a92694817\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xksnd" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135739 4711 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135760 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-etcd-client\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135778 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cxhv\" (UniqueName: \"kubernetes.io/projected/fe2d0a48-2982-4c9b-945b-940c24cda7ff-kube-api-access-9cxhv\") pod \"cluster-samples-operator-665b6dd947-r72zb\" (UID: \"fe2d0a48-2982-4c9b-945b-940c24cda7ff\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r72zb" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135797 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95-secret-volume\") pod \"collect-profiles-29412735-shrp8\" (UID: \"a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-shrp8" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135816 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-image-import-ca\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " 
pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135814 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-audit-policies\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135840 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/14ef9276-a825-478b-bee0-674657696c8e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-q47nn\" (UID: \"14ef9276-a825-478b-bee0-674657696c8e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q47nn" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135863 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ba64eaf-060d-417e-8831-f67b27012082-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jdq9h\" (UID: \"6ba64eaf-060d-417e-8831-f67b27012082\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jdq9h" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135891 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-296hh\" (UniqueName: \"kubernetes.io/projected/e9a71eaa-37ba-4311-8f1f-30b4580cb030-kube-api-access-296hh\") pod \"machine-config-controller-84d6567774-nj2fj\" (UID: \"e9a71eaa-37ba-4311-8f1f-30b4580cb030\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nj2fj" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135930 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e5bc284-86b9-486f-aff2-5f277e323011-trusted-ca\") pod \"ingress-operator-5b745b69d9-nmlhj\" (UID: \"8e5bc284-86b9-486f-aff2-5f277e323011\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nmlhj" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135955 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04039569-bc9d-4bbe-972a-235add83f1b8-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-fmpjf\" (UID: \"04039569-bc9d-4bbe-972a-235add83f1b8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmpjf" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135980 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/064e5e0c-68f0-4d05-b054-17948a298623-images\") pod \"machine-api-operator-5694c8668f-hnrkf\" (UID: \"064e5e0c-68f0-4d05-b054-17948a298623\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hnrkf" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.135995 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-serving-cert\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.136012 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8e5bc284-86b9-486f-aff2-5f277e323011-bound-sa-token\") pod \"ingress-operator-5b745b69d9-nmlhj\" (UID: \"8e5bc284-86b9-486f-aff2-5f277e323011\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nmlhj" Dec 03 12:17:08 crc kubenswrapper[4711]: 
I1203 12:17:08.136031 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e9a71eaa-37ba-4311-8f1f-30b4580cb030-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nj2fj\" (UID: \"e9a71eaa-37ba-4311-8f1f-30b4580cb030\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nj2fj" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.136048 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-audit\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.136072 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.136094 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1cdfc9ef-2385-43ab-ad70-912c119522c7-webhook-cert\") pod \"packageserver-d55dfcdfc-9zjpk\" (UID: \"1cdfc9ef-2385-43ab-ad70-912c119522c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9zjpk" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.136113 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.136128 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-config\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.136149 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdgrv\" (UniqueName: \"kubernetes.io/projected/8cc669b1-f54c-476c-ae6c-81f5bf982b7e-kube-api-access-wdgrv\") pod \"downloads-7954f5f757-x6jjm\" (UID: \"8cc669b1-f54c-476c-ae6c-81f5bf982b7e\") " pod="openshift-console/downloads-7954f5f757-x6jjm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.136169 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab93fa35-7435-4886-bcd7-a13ca47bd8fd-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7brcf\" (UID: \"ab93fa35-7435-4886-bcd7-a13ca47bd8fd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7brcf" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.136187 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.136207 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtxsk\" (UniqueName: 
\"kubernetes.io/projected/8b25215d-cee5-46c3-8008-d9165ac32b96-kube-api-access-wtxsk\") pod \"kube-storage-version-migrator-operator-b67b599dd-56rtb\" (UID: \"8b25215d-cee5-46c3-8008-d9165ac32b96\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-56rtb" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.136225 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.136244 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42bb9e00-bc53-46ba-96d6-e38a92694817-proxy-tls\") pod \"machine-config-operator-74547568cd-xksnd\" (UID: \"42bb9e00-bc53-46ba-96d6-e38a92694817\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xksnd" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.136263 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-node-pullsecrets\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.136280 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1cdfc9ef-2385-43ab-ad70-912c119522c7-apiservice-cert\") pod \"packageserver-d55dfcdfc-9zjpk\" (UID: \"1cdfc9ef-2385-43ab-ad70-912c119522c7\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9zjpk" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.136329 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqzfw\" (UniqueName: \"kubernetes.io/projected/8e5bc284-86b9-486f-aff2-5f277e323011-kube-api-access-qqzfw\") pod \"ingress-operator-5b745b69d9-nmlhj\" (UID: \"8e5bc284-86b9-486f-aff2-5f277e323011\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nmlhj" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.136355 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.136372 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/09fb9b94-8286-4064-a6f5-15362c4f300d-signing-cabundle\") pod \"service-ca-9c57cc56f-5f942\" (UID: \"09fb9b94-8286-4064-a6f5-15362c4f300d\") " pod="openshift-service-ca/service-ca-9c57cc56f-5f942" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.136389 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e5bc284-86b9-486f-aff2-5f277e323011-metrics-tls\") pod \"ingress-operator-5b745b69d9-nmlhj\" (UID: \"8e5bc284-86b9-486f-aff2-5f277e323011\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nmlhj" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.136379 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/09fbd47f-ad50-4d45-8505-f73dd228dd58-available-featuregates\") pod \"openshift-config-operator-7777fb866f-59vns\" (UID: \"09fbd47f-ad50-4d45-8505-f73dd228dd58\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-59vns" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.136408 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7730e767-9d1f-4ac7-867a-08760f9d1990-config\") pod \"kube-controller-manager-operator-78b949d7b-s24hc\" (UID: \"7730e767-9d1f-4ac7-867a-08760f9d1990\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s24hc" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.136505 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxmx8\" (UniqueName: \"kubernetes.io/projected/42bb9e00-bc53-46ba-96d6-e38a92694817-kube-api-access-bxmx8\") pod \"machine-config-operator-74547568cd-xksnd\" (UID: \"42bb9e00-bc53-46ba-96d6-e38a92694817\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xksnd" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.136557 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/09fb9b94-8286-4064-a6f5-15362c4f300d-signing-key\") pod \"service-ca-9c57cc56f-5f942\" (UID: \"09fb9b94-8286-4064-a6f5-15362c4f300d\") " pod="openshift-service-ca/service-ca-9c57cc56f-5f942" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.136599 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9e2da1c-8e56-47d1-930e-6afacb8839e1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-6wll9\" (UID: 
\"b9e2da1c-8e56-47d1-930e-6afacb8839e1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wll9" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.136634 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23-metrics-certs\") pod \"router-default-5444994796-ng58r\" (UID: \"efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23\") " pod="openshift-ingress/router-default-5444994796-ng58r" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.136665 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b25215d-cee5-46c3-8008-d9165ac32b96-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-56rtb\" (UID: \"8b25215d-cee5-46c3-8008-d9165ac32b96\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-56rtb" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.136724 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25c69a31-2a54-4077-9b8c-c859d1d20849-client-ca\") pod \"controller-manager-879f6c89f-dtc2k\" (UID: \"25c69a31-2a54-4077-9b8c-c859d1d20849\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtc2k" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.136764 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7730e767-9d1f-4ac7-867a-08760f9d1990-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-s24hc\" (UID: \"7730e767-9d1f-4ac7-867a-08760f9d1990\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s24hc" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.136803 4711 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01c32d8f-3d2d-4bcf-bb04-620569d59ff5-config\") pod \"service-ca-operator-777779d784-dn4mb\" (UID: \"01c32d8f-3d2d-4bcf-bb04-620569d59ff5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dn4mb" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.136855 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.137276 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-audit-dir\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.137879 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.138156 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " 
pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.138250 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.138383 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-image-import-ca\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.138519 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e9a71eaa-37ba-4311-8f1f-30b4580cb030-proxy-tls\") pod \"machine-config-controller-84d6567774-nj2fj\" (UID: \"e9a71eaa-37ba-4311-8f1f-30b4580cb030\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nj2fj" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.138616 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.139134 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-node-pullsecrets\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.139170 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-config\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.139627 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.140100 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/064e5e0c-68f0-4d05-b054-17948a298623-images\") pod \"machine-api-operator-5694c8668f-hnrkf\" (UID: \"064e5e0c-68f0-4d05-b054-17948a298623\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hnrkf" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.140264 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-audit-dir\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.140308 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/e9a71eaa-37ba-4311-8f1f-30b4580cb030-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nj2fj\" (UID: \"e9a71eaa-37ba-4311-8f1f-30b4580cb030\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nj2fj" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.140357 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.140461 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-audit\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.140537 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2f826dc-7227-4d0d-844c-4b84a5902d7b-config\") pod \"openshift-apiserver-operator-796bbdcf4f-5gvpv\" (UID: \"c2f826dc-7227-4d0d-844c-4b84a5902d7b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5gvpv" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.141376 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/064e5e0c-68f0-4d05-b054-17948a298623-config\") pod \"machine-api-operator-5694c8668f-hnrkf\" (UID: \"064e5e0c-68f0-4d05-b054-17948a298623\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hnrkf" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.141522 4711 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25c69a31-2a54-4077-9b8c-c859d1d20849-client-ca\") pod \"controller-manager-879f6c89f-dtc2k\" (UID: \"25c69a31-2a54-4077-9b8c-c859d1d20849\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtc2k" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.141886 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.142368 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-etcd-client\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.142385 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25c69a31-2a54-4077-9b8c-c859d1d20849-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dtc2k\" (UID: \"25c69a31-2a54-4077-9b8c-c859d1d20849\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtc2k" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.142796 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25c69a31-2a54-4077-9b8c-c859d1d20849-config\") pod \"controller-manager-879f6c89f-dtc2k\" (UID: \"25c69a31-2a54-4077-9b8c-c859d1d20849\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtc2k" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.144003 4711 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-encryption-config\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.144269 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2f826dc-7227-4d0d-844c-4b84a5902d7b-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-5gvpv\" (UID: \"c2f826dc-7227-4d0d-844c-4b84a5902d7b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5gvpv" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.144306 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09fbd47f-ad50-4d45-8505-f73dd228dd58-serving-cert\") pod \"openshift-config-operator-7777fb866f-59vns\" (UID: \"09fbd47f-ad50-4d45-8505-f73dd228dd58\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-59vns" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.144445 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-serving-cert\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.144538 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 
12:17:08.144894 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.145527 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.146219 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.146402 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/064e5e0c-68f0-4d05-b054-17948a298623-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-hnrkf\" (UID: \"064e5e0c-68f0-4d05-b054-17948a298623\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hnrkf" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.148505 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.150536 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.156708 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25c69a31-2a54-4077-9b8c-c859d1d20849-serving-cert\") pod \"controller-manager-879f6c89f-dtc2k\" (UID: \"25c69a31-2a54-4077-9b8c-c859d1d20849\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtc2k" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.157493 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe2d0a48-2982-4c9b-945b-940c24cda7ff-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-r72zb\" (UID: \"fe2d0a48-2982-4c9b-945b-940c24cda7ff\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r72zb" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.169932 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.196128 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.210856 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.231185 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 
12:17:08.237670 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab93fa35-7435-4886-bcd7-a13ca47bd8fd-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7brcf\" (UID: \"ab93fa35-7435-4886-bcd7-a13ca47bd8fd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7brcf" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.237709 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtxsk\" (UniqueName: \"kubernetes.io/projected/8b25215d-cee5-46c3-8008-d9165ac32b96-kube-api-access-wtxsk\") pod \"kube-storage-version-migrator-operator-b67b599dd-56rtb\" (UID: \"8b25215d-cee5-46c3-8008-d9165ac32b96\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-56rtb" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.237756 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42bb9e00-bc53-46ba-96d6-e38a92694817-proxy-tls\") pod \"machine-config-operator-74547568cd-xksnd\" (UID: \"42bb9e00-bc53-46ba-96d6-e38a92694817\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xksnd" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.237773 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1cdfc9ef-2385-43ab-ad70-912c119522c7-apiservice-cert\") pod \"packageserver-d55dfcdfc-9zjpk\" (UID: \"1cdfc9ef-2385-43ab-ad70-912c119522c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9zjpk" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.237798 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/09fb9b94-8286-4064-a6f5-15362c4f300d-signing-cabundle\") 
pod \"service-ca-9c57cc56f-5f942\" (UID: \"09fb9b94-8286-4064-a6f5-15362c4f300d\") " pod="openshift-service-ca/service-ca-9c57cc56f-5f942" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.237815 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqzfw\" (UniqueName: \"kubernetes.io/projected/8e5bc284-86b9-486f-aff2-5f277e323011-kube-api-access-qqzfw\") pod \"ingress-operator-5b745b69d9-nmlhj\" (UID: \"8e5bc284-86b9-486f-aff2-5f277e323011\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nmlhj" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.237833 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e5bc284-86b9-486f-aff2-5f277e323011-metrics-tls\") pod \"ingress-operator-5b745b69d9-nmlhj\" (UID: \"8e5bc284-86b9-486f-aff2-5f277e323011\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nmlhj" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.237852 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7730e767-9d1f-4ac7-867a-08760f9d1990-config\") pod \"kube-controller-manager-operator-78b949d7b-s24hc\" (UID: \"7730e767-9d1f-4ac7-867a-08760f9d1990\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s24hc" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.237868 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxmx8\" (UniqueName: \"kubernetes.io/projected/42bb9e00-bc53-46ba-96d6-e38a92694817-kube-api-access-bxmx8\") pod \"machine-config-operator-74547568cd-xksnd\" (UID: \"42bb9e00-bc53-46ba-96d6-e38a92694817\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xksnd" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.237885 4711 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/09fb9b94-8286-4064-a6f5-15362c4f300d-signing-key\") pod \"service-ca-9c57cc56f-5f942\" (UID: \"09fb9b94-8286-4064-a6f5-15362c4f300d\") " pod="openshift-service-ca/service-ca-9c57cc56f-5f942" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.238066 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9e2da1c-8e56-47d1-930e-6afacb8839e1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-6wll9\" (UID: \"b9e2da1c-8e56-47d1-930e-6afacb8839e1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wll9" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.238091 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23-metrics-certs\") pod \"router-default-5444994796-ng58r\" (UID: \"efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23\") " pod="openshift-ingress/router-default-5444994796-ng58r" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.238183 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b25215d-cee5-46c3-8008-d9165ac32b96-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-56rtb\" (UID: \"8b25215d-cee5-46c3-8008-d9165ac32b96\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-56rtb" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.238203 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7730e767-9d1f-4ac7-867a-08760f9d1990-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-s24hc\" (UID: \"7730e767-9d1f-4ac7-867a-08760f9d1990\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s24hc" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.238299 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01c32d8f-3d2d-4bcf-bb04-620569d59ff5-config\") pod \"service-ca-operator-777779d784-dn4mb\" (UID: \"01c32d8f-3d2d-4bcf-bb04-620569d59ff5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dn4mb" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.238371 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ec52f86-45ff-4a9d-8913-a8848e0589e7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-66k2k\" (UID: \"9ec52f86-45ff-4a9d-8913-a8848e0589e7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66k2k" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.238405 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9e2da1c-8e56-47d1-930e-6afacb8839e1-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-6wll9\" (UID: \"b9e2da1c-8e56-47d1-930e-6afacb8839e1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wll9" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.238429 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9cdc9430-275b-4891-858d-a68d9148058e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-45rj8\" (UID: \"9cdc9430-275b-4891-858d-a68d9148058e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-45rj8" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.238445 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/01c32d8f-3d2d-4bcf-bb04-620569d59ff5-serving-cert\") pod \"service-ca-operator-777779d784-dn4mb\" (UID: \"01c32d8f-3d2d-4bcf-bb04-620569d59ff5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dn4mb" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.238462 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9bc96e8b-e5df-40a7-8690-530895999a16-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-c8g7k\" (UID: \"9bc96e8b-e5df-40a7-8690-530895999a16\") " pod="openshift-marketplace/marketplace-operator-79b997595-c8g7k" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.238480 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b25215d-cee5-46c3-8008-d9165ac32b96-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-56rtb\" (UID: \"8b25215d-cee5-46c3-8008-d9165ac32b96\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-56rtb" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.238497 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23-stats-auth\") pod \"router-default-5444994796-ng58r\" (UID: \"efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23\") " pod="openshift-ingress/router-default-5444994796-ng58r" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.238516 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3095a2e-f5c5-436e-953a-1fb6ae1950bb-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-zhfzk\" (UID: \"f3095a2e-f5c5-436e-953a-1fb6ae1950bb\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zhfzk" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.238538 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e00519d7-eaff-4176-a850-486dbb03e0cc-srv-cert\") pod \"catalog-operator-68c6474976-khs4n\" (UID: \"e00519d7-eaff-4176-a850-486dbb03e0cc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khs4n" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.238560 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e00519d7-eaff-4176-a850-486dbb03e0cc-profile-collector-cert\") pod \"catalog-operator-68c6474976-khs4n\" (UID: \"e00519d7-eaff-4176-a850-486dbb03e0cc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khs4n" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.238579 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq7rh\" (UniqueName: \"kubernetes.io/projected/e00519d7-eaff-4176-a850-486dbb03e0cc-kube-api-access-kq7rh\") pod \"catalog-operator-68c6474976-khs4n\" (UID: \"e00519d7-eaff-4176-a850-486dbb03e0cc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khs4n" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.238597 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94tcf\" (UniqueName: \"kubernetes.io/projected/9cdc9430-275b-4891-858d-a68d9148058e-kube-api-access-94tcf\") pod \"multus-admission-controller-857f4d67dd-45rj8\" (UID: \"9cdc9430-275b-4891-858d-a68d9148058e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-45rj8" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.238616 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhxh8\" (UniqueName: 
\"kubernetes.io/projected/9bc96e8b-e5df-40a7-8690-530895999a16-kube-api-access-jhxh8\") pod \"marketplace-operator-79b997595-c8g7k\" (UID: \"9bc96e8b-e5df-40a7-8690-530895999a16\") " pod="openshift-marketplace/marketplace-operator-79b997595-c8g7k" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.238640 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04039569-bc9d-4bbe-972a-235add83f1b8-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-fmpjf\" (UID: \"04039569-bc9d-4bbe-972a-235add83f1b8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmpjf" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.238664 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwg5t\" (UniqueName: \"kubernetes.io/projected/9ec52f86-45ff-4a9d-8913-a8848e0589e7-kube-api-access-gwg5t\") pod \"package-server-manager-789f6589d5-66k2k\" (UID: \"9ec52f86-45ff-4a9d-8913-a8848e0589e7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66k2k" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.238682 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/10e059c5-9cf9-46f7-b31a-b97aae40f17a-metrics-tls\") pod \"dns-operator-744455d44c-mjzpn\" (UID: \"10e059c5-9cf9-46f7-b31a-b97aae40f17a\") " pod="openshift-dns-operator/dns-operator-744455d44c-mjzpn" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.238699 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7730e767-9d1f-4ac7-867a-08760f9d1990-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-s24hc\" (UID: \"7730e767-9d1f-4ac7-867a-08760f9d1990\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s24hc" Dec 03 
12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.238715 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/42bb9e00-bc53-46ba-96d6-e38a92694817-auth-proxy-config\") pod \"machine-config-operator-74547568cd-xksnd\" (UID: \"42bb9e00-bc53-46ba-96d6-e38a92694817\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xksnd" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.238741 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b9e2da1c-8e56-47d1-930e-6afacb8839e1-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-6wll9\" (UID: \"b9e2da1c-8e56-47d1-930e-6afacb8839e1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wll9" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.238757 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab93fa35-7435-4886-bcd7-a13ca47bd8fd-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7brcf\" (UID: \"ab93fa35-7435-4886-bcd7-a13ca47bd8fd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7brcf" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.238783 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6jxb\" (UniqueName: \"kubernetes.io/projected/b9e2da1c-8e56-47d1-930e-6afacb8839e1-kube-api-access-p6jxb\") pod \"cluster-image-registry-operator-dc59b4c8b-6wll9\" (UID: \"b9e2da1c-8e56-47d1-930e-6afacb8839e1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wll9" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.239132 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5m9v\" (UniqueName: 
\"kubernetes.io/projected/f3095a2e-f5c5-436e-953a-1fb6ae1950bb-kube-api-access-h5m9v\") pod \"control-plane-machine-set-operator-78cbb6b69f-zhfzk\" (UID: \"f3095a2e-f5c5-436e-953a-1fb6ae1950bb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zhfzk" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.239247 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sk2j\" (UniqueName: \"kubernetes.io/projected/01c32d8f-3d2d-4bcf-bb04-620569d59ff5-kube-api-access-2sk2j\") pod \"service-ca-operator-777779d784-dn4mb\" (UID: \"01c32d8f-3d2d-4bcf-bb04-620569d59ff5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dn4mb" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.239273 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23-service-ca-bundle\") pod \"router-default-5444994796-ng58r\" (UID: \"efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23\") " pod="openshift-ingress/router-default-5444994796-ng58r" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.239382 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/14ef9276-a825-478b-bee0-674657696c8e-srv-cert\") pod \"olm-operator-6b444d44fb-q47nn\" (UID: \"14ef9276-a825-478b-bee0-674657696c8e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q47nn" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.239400 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kqs9\" (UniqueName: \"kubernetes.io/projected/efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23-kube-api-access-6kqs9\") pod \"router-default-5444994796-ng58r\" (UID: \"efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23\") " pod="openshift-ingress/router-default-5444994796-ng58r" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 
12:17:08.239423 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh9h2\" (UniqueName: \"kubernetes.io/projected/09fb9b94-8286-4064-a6f5-15362c4f300d-kube-api-access-zh9h2\") pod \"service-ca-9c57cc56f-5f942\" (UID: \"09fb9b94-8286-4064-a6f5-15362c4f300d\") " pod="openshift-service-ca/service-ca-9c57cc56f-5f942" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.239432 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/42bb9e00-bc53-46ba-96d6-e38a92694817-auth-proxy-config\") pod \"machine-config-operator-74547568cd-xksnd\" (UID: \"42bb9e00-bc53-46ba-96d6-e38a92694817\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xksnd" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.239441 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab93fa35-7435-4886-bcd7-a13ca47bd8fd-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7brcf\" (UID: \"ab93fa35-7435-4886-bcd7-a13ca47bd8fd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7brcf" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.239503 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/1cdfc9ef-2385-43ab-ad70-912c119522c7-tmpfs\") pod \"packageserver-d55dfcdfc-9zjpk\" (UID: \"1cdfc9ef-2385-43ab-ad70-912c119522c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9zjpk" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.239527 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blvj9\" (UniqueName: \"kubernetes.io/projected/1cdfc9ef-2385-43ab-ad70-912c119522c7-kube-api-access-blvj9\") pod \"packageserver-d55dfcdfc-9zjpk\" (UID: \"1cdfc9ef-2385-43ab-ad70-912c119522c7\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9zjpk" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.239545 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ba64eaf-060d-417e-8831-f67b27012082-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jdq9h\" (UID: \"6ba64eaf-060d-417e-8831-f67b27012082\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jdq9h" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.239562 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9bc96e8b-e5df-40a7-8690-530895999a16-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-c8g7k\" (UID: \"9bc96e8b-e5df-40a7-8690-530895999a16\") " pod="openshift-marketplace/marketplace-operator-79b997595-c8g7k" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.239614 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mktx6\" (UniqueName: \"kubernetes.io/projected/6ba64eaf-060d-417e-8831-f67b27012082-kube-api-access-mktx6\") pod \"openshift-controller-manager-operator-756b6f6bc6-jdq9h\" (UID: \"6ba64eaf-060d-417e-8831-f67b27012082\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jdq9h" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.239631 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04039569-bc9d-4bbe-972a-235add83f1b8-config\") pod \"kube-apiserver-operator-766d6c64bb-fmpjf\" (UID: \"04039569-bc9d-4bbe-972a-235add83f1b8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmpjf" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.239650 4711 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-829fm\" (UniqueName: \"kubernetes.io/projected/10e059c5-9cf9-46f7-b31a-b97aae40f17a-kube-api-access-829fm\") pod \"dns-operator-744455d44c-mjzpn\" (UID: \"10e059c5-9cf9-46f7-b31a-b97aae40f17a\") " pod="openshift-dns-operator/dns-operator-744455d44c-mjzpn" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.239679 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjnsj\" (UniqueName: \"kubernetes.io/projected/a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95-kube-api-access-gjnsj\") pod \"collect-profiles-29412735-shrp8\" (UID: \"a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-shrp8" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.239695 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23-default-certificate\") pod \"router-default-5444994796-ng58r\" (UID: \"efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23\") " pod="openshift-ingress/router-default-5444994796-ng58r" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.239721 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95-config-volume\") pod \"collect-profiles-29412735-shrp8\" (UID: \"a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-shrp8" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.239740 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m4m7\" (UniqueName: \"kubernetes.io/projected/2975f7c8-1440-4eac-bccb-8746f327f768-kube-api-access-5m4m7\") pod \"migrator-59844c95c7-zzqqb\" (UID: \"2975f7c8-1440-4eac-bccb-8746f327f768\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zzqqb" Dec 03 
12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.239757 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8fcw\" (UniqueName: \"kubernetes.io/projected/14ef9276-a825-478b-bee0-674657696c8e-kube-api-access-k8fcw\") pod \"olm-operator-6b444d44fb-q47nn\" (UID: \"14ef9276-a825-478b-bee0-674657696c8e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q47nn" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.239777 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/42bb9e00-bc53-46ba-96d6-e38a92694817-images\") pod \"machine-config-operator-74547568cd-xksnd\" (UID: \"42bb9e00-bc53-46ba-96d6-e38a92694817\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xksnd" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.239796 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95-secret-volume\") pod \"collect-profiles-29412735-shrp8\" (UID: \"a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-shrp8" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.239813 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/14ef9276-a825-478b-bee0-674657696c8e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-q47nn\" (UID: \"14ef9276-a825-478b-bee0-674657696c8e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q47nn" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.239830 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ba64eaf-060d-417e-8831-f67b27012082-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-jdq9h\" (UID: \"6ba64eaf-060d-417e-8831-f67b27012082\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jdq9h" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.239855 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e5bc284-86b9-486f-aff2-5f277e323011-trusted-ca\") pod \"ingress-operator-5b745b69d9-nmlhj\" (UID: \"8e5bc284-86b9-486f-aff2-5f277e323011\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nmlhj" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.239874 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04039569-bc9d-4bbe-972a-235add83f1b8-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-fmpjf\" (UID: \"04039569-bc9d-4bbe-972a-235add83f1b8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmpjf" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.239891 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8e5bc284-86b9-486f-aff2-5f277e323011-bound-sa-token\") pod \"ingress-operator-5b745b69d9-nmlhj\" (UID: \"8e5bc284-86b9-486f-aff2-5f277e323011\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nmlhj" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.239940 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1cdfc9ef-2385-43ab-ad70-912c119522c7-webhook-cert\") pod \"packageserver-d55dfcdfc-9zjpk\" (UID: \"1cdfc9ef-2385-43ab-ad70-912c119522c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9zjpk" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.240212 4711 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9bc96e8b-e5df-40a7-8690-530895999a16-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-c8g7k\" (UID: \"9bc96e8b-e5df-40a7-8690-530895999a16\") " pod="openshift-marketplace/marketplace-operator-79b997595-c8g7k" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.240715 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/1cdfc9ef-2385-43ab-ad70-912c119522c7-tmpfs\") pod \"packageserver-d55dfcdfc-9zjpk\" (UID: \"1cdfc9ef-2385-43ab-ad70-912c119522c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9zjpk" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.241043 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/42bb9e00-bc53-46ba-96d6-e38a92694817-images\") pod \"machine-config-operator-74547568cd-xksnd\" (UID: \"42bb9e00-bc53-46ba-96d6-e38a92694817\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xksnd" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.241182 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42bb9e00-bc53-46ba-96d6-e38a92694817-proxy-tls\") pod \"machine-config-operator-74547568cd-xksnd\" (UID: \"42bb9e00-bc53-46ba-96d6-e38a92694817\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xksnd" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.241703 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ec52f86-45ff-4a9d-8913-a8848e0589e7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-66k2k\" (UID: \"9ec52f86-45ff-4a9d-8913-a8848e0589e7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66k2k" 
Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.242488 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9cdc9430-275b-4891-858d-a68d9148058e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-45rj8\" (UID: \"9cdc9430-275b-4891-858d-a68d9148058e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-45rj8" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.242862 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9bc96e8b-e5df-40a7-8690-530895999a16-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-c8g7k\" (UID: \"9bc96e8b-e5df-40a7-8690-530895999a16\") " pod="openshift-marketplace/marketplace-operator-79b997595-c8g7k" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.250962 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.270607 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.290540 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.312371 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.332429 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.351360 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 03 12:17:08 crc 
kubenswrapper[4711]: I1203 12:17:08.371027 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.382087 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/10e059c5-9cf9-46f7-b31a-b97aae40f17a-metrics-tls\") pod \"dns-operator-744455d44c-mjzpn\" (UID: \"10e059c5-9cf9-46f7-b31a-b97aae40f17a\") " pod="openshift-dns-operator/dns-operator-744455d44c-mjzpn" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.391411 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.410956 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.430281 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.451110 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.470240 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.483206 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01c32d8f-3d2d-4bcf-bb04-620569d59ff5-serving-cert\") pod \"service-ca-operator-777779d784-dn4mb\" (UID: \"01c32d8f-3d2d-4bcf-bb04-620569d59ff5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dn4mb" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.491055 4711 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.510551 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.519546 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01c32d8f-3d2d-4bcf-bb04-620569d59ff5-config\") pod \"service-ca-operator-777779d784-dn4mb\" (UID: \"01c32d8f-3d2d-4bcf-bb04-620569d59ff5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dn4mb" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.530473 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.533939 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/14ef9276-a825-478b-bee0-674657696c8e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-q47nn\" (UID: \"14ef9276-a825-478b-bee0-674657696c8e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q47nn" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.535259 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95-secret-volume\") pod \"collect-profiles-29412735-shrp8\" (UID: \"a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-shrp8" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.544105 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e00519d7-eaff-4176-a850-486dbb03e0cc-profile-collector-cert\") pod 
\"catalog-operator-68c6474976-khs4n\" (UID: \"e00519d7-eaff-4176-a850-486dbb03e0cc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khs4n" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.550866 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.563975 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e00519d7-eaff-4176-a850-486dbb03e0cc-srv-cert\") pod \"catalog-operator-68c6474976-khs4n\" (UID: \"e00519d7-eaff-4176-a850-486dbb03e0cc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khs4n" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.571894 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.584515 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/14ef9276-a825-478b-bee0-674657696c8e-srv-cert\") pod \"olm-operator-6b444d44fb-q47nn\" (UID: \"14ef9276-a825-478b-bee0-674657696c8e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q47nn" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.598752 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.611184 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9e2da1c-8e56-47d1-930e-6afacb8839e1-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-6wll9\" (UID: \"b9e2da1c-8e56-47d1-930e-6afacb8839e1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wll9" Dec 03 12:17:08 crc 
kubenswrapper[4711]: I1203 12:17:08.611699 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.622039 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9e2da1c-8e56-47d1-930e-6afacb8839e1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-6wll9\" (UID: \"b9e2da1c-8e56-47d1-930e-6afacb8839e1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wll9" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.632134 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.646533 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:08 crc kubenswrapper[4711]: E1203 12:17:08.646794 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:19:10.646754625 +0000 UTC m=+269.316005920 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.652398 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.671076 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.683366 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7730e767-9d1f-4ac7-867a-08760f9d1990-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-s24hc\" (UID: \"7730e767-9d1f-4ac7-867a-08760f9d1990\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s24hc" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.692017 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.699739 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7730e767-9d1f-4ac7-867a-08760f9d1990-config\") pod \"kube-controller-manager-operator-78b949d7b-s24hc\" (UID: \"7730e767-9d1f-4ac7-867a-08760f9d1990\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s24hc" Dec 03 12:17:08 crc 
kubenswrapper[4711]: I1203 12:17:08.710998 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.731822 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.748686 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.749016 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.749114 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.749312 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.750645 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.751196 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.753399 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.755616 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.755627 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.770676 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.790005 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.791307 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95-config-volume\") pod \"collect-profiles-29412735-shrp8\" (UID: \"a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-shrp8" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.812316 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.832437 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.844149 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3095a2e-f5c5-436e-953a-1fb6ae1950bb-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-zhfzk\" (UID: \"f3095a2e-f5c5-436e-953a-1fb6ae1950bb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zhfzk" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.848883 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.851643 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.868431 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.870864 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.886328 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.891714 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.912178 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.922370 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b25215d-cee5-46c3-8008-d9165ac32b96-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-56rtb\" (UID: \"8b25215d-cee5-46c3-8008-d9165ac32b96\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-56rtb" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.931895 4711 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.941621 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b25215d-cee5-46c3-8008-d9165ac32b96-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-56rtb\" (UID: \"8b25215d-cee5-46c3-8008-d9165ac32b96\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-56rtb" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.960297 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.971573 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.979118 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/09fb9b94-8286-4064-a6f5-15362c4f300d-signing-cabundle\") pod \"service-ca-9c57cc56f-5f942\" (UID: \"09fb9b94-8286-4064-a6f5-15362c4f300d\") " pod="openshift-service-ca/service-ca-9c57cc56f-5f942" Dec 03 12:17:08 crc kubenswrapper[4711]: I1203 12:17:08.993044 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.009257 4711 request.go:700] Waited for 1.002346826s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca/secrets?fieldSelector=metadata.name%3Dservice-ca-dockercfg-pn86c&limit=500&resourceVersion=0 Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.011335 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 
03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.030698 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.055722 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.064379 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/09fb9b94-8286-4064-a6f5-15362c4f300d-signing-key\") pod \"service-ca-9c57cc56f-5f942\" (UID: \"09fb9b94-8286-4064-a6f5-15362c4f300d\") " pod="openshift-service-ca/service-ca-9c57cc56f-5f942" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.070226 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.081129 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ba64eaf-060d-417e-8831-f67b27012082-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jdq9h\" (UID: \"6ba64eaf-060d-417e-8831-f67b27012082\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jdq9h" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.090748 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.110721 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.130771 4711 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 03 12:17:09 crc kubenswrapper[4711]: W1203 12:17:09.135078 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-845eab113f33ff3b8479278a594c0e334f43e79bde8808571031748b54187828 WatchSource:0}: Error finding container 845eab113f33ff3b8479278a594c0e334f43e79bde8808571031748b54187828: Status 404 returned error can't find the container with id 845eab113f33ff3b8479278a594c0e334f43e79bde8808571031748b54187828 Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.151460 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.162893 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04039569-bc9d-4bbe-972a-235add83f1b8-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-fmpjf\" (UID: \"04039569-bc9d-4bbe-972a-235add83f1b8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmpjf" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.172094 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.181306 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04039569-bc9d-4bbe-972a-235add83f1b8-config\") pod \"kube-apiserver-operator-766d6c64bb-fmpjf\" (UID: \"04039569-bc9d-4bbe-972a-235add83f1b8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmpjf" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.191053 4711 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.210263 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.231462 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 03 12:17:09 crc kubenswrapper[4711]: E1203 12:17:09.238853 4711 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Dec 03 12:17:09 crc kubenswrapper[4711]: E1203 12:17:09.238942 4711 secret.go:188] Couldn't get secret openshift-kube-scheduler-operator/kube-scheduler-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 03 12:17:09 crc kubenswrapper[4711]: E1203 12:17:09.238975 4711 secret.go:188] Couldn't get secret openshift-ingress-operator/metrics-tls: failed to sync secret cache: timed out waiting for the condition Dec 03 12:17:09 crc kubenswrapper[4711]: E1203 12:17:09.239043 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab93fa35-7435-4886-bcd7-a13ca47bd8fd-serving-cert podName:ab93fa35-7435-4886-bcd7-a13ca47bd8fd nodeName:}" failed. No retries permitted until 2025-12-03 12:17:09.739013791 +0000 UTC m=+148.408265076 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/ab93fa35-7435-4886-bcd7-a13ca47bd8fd-serving-cert") pod "openshift-kube-scheduler-operator-5fdd9b5758-7brcf" (UID: "ab93fa35-7435-4886-bcd7-a13ca47bd8fd") : failed to sync secret cache: timed out waiting for the condition Dec 03 12:17:09 crc kubenswrapper[4711]: E1203 12:17:09.238947 4711 secret.go:188] Couldn't get secret openshift-ingress/router-metrics-certs-default: failed to sync secret cache: timed out waiting for the condition Dec 03 12:17:09 crc kubenswrapper[4711]: E1203 12:17:09.239076 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e5bc284-86b9-486f-aff2-5f277e323011-metrics-tls podName:8e5bc284-86b9-486f-aff2-5f277e323011 nodeName:}" failed. No retries permitted until 2025-12-03 12:17:09.739060862 +0000 UTC m=+148.408312157 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8e5bc284-86b9-486f-aff2-5f277e323011-metrics-tls") pod "ingress-operator-5b745b69d9-nmlhj" (UID: "8e5bc284-86b9-486f-aff2-5f277e323011") : failed to sync secret cache: timed out waiting for the condition Dec 03 12:17:09 crc kubenswrapper[4711]: E1203 12:17:09.239122 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23-metrics-certs podName:efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23 nodeName:}" failed. No retries permitted until 2025-12-03 12:17:09.739096883 +0000 UTC m=+148.408348138 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23-metrics-certs") pod "router-default-5444994796-ng58r" (UID: "efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23") : failed to sync secret cache: timed out waiting for the condition Dec 03 12:17:09 crc kubenswrapper[4711]: E1203 12:17:09.239197 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cdfc9ef-2385-43ab-ad70-912c119522c7-apiservice-cert podName:1cdfc9ef-2385-43ab-ad70-912c119522c7 nodeName:}" failed. No retries permitted until 2025-12-03 12:17:09.739185826 +0000 UTC m=+148.408437071 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/1cdfc9ef-2385-43ab-ad70-912c119522c7-apiservice-cert") pod "packageserver-d55dfcdfc-9zjpk" (UID: "1cdfc9ef-2385-43ab-ad70-912c119522c7") : failed to sync secret cache: timed out waiting for the condition Dec 03 12:17:09 crc kubenswrapper[4711]: E1203 12:17:09.239520 4711 configmap.go:193] Couldn't get configMap openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-config: failed to sync configmap cache: timed out waiting for the condition Dec 03 12:17:09 crc kubenswrapper[4711]: E1203 12:17:09.239528 4711 configmap.go:193] Couldn't get configMap openshift-ingress/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Dec 03 12:17:09 crc kubenswrapper[4711]: E1203 12:17:09.239575 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ab93fa35-7435-4886-bcd7-a13ca47bd8fd-config podName:ab93fa35-7435-4886-bcd7-a13ca47bd8fd nodeName:}" failed. No retries permitted until 2025-12-03 12:17:09.739562305 +0000 UTC m=+148.408813550 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/ab93fa35-7435-4886-bcd7-a13ca47bd8fd-config") pod "openshift-kube-scheduler-operator-5fdd9b5758-7brcf" (UID: "ab93fa35-7435-4886-bcd7-a13ca47bd8fd") : failed to sync configmap cache: timed out waiting for the condition Dec 03 12:17:09 crc kubenswrapper[4711]: E1203 12:17:09.239605 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23-service-ca-bundle podName:efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23 nodeName:}" failed. No retries permitted until 2025-12-03 12:17:09.739585506 +0000 UTC m=+148.408836761 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23-service-ca-bundle") pod "router-default-5444994796-ng58r" (UID: "efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23") : failed to sync configmap cache: timed out waiting for the condition Dec 03 12:17:09 crc kubenswrapper[4711]: E1203 12:17:09.239844 4711 secret.go:188] Couldn't get secret openshift-ingress/router-stats-default: failed to sync secret cache: timed out waiting for the condition Dec 03 12:17:09 crc kubenswrapper[4711]: E1203 12:17:09.240086 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23-stats-auth podName:efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23 nodeName:}" failed. No retries permitted until 2025-12-03 12:17:09.740060588 +0000 UTC m=+148.409311883 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "stats-auth" (UniqueName: "kubernetes.io/secret/efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23-stats-auth") pod "router-default-5444994796-ng58r" (UID: "efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23") : failed to sync secret cache: timed out waiting for the condition Dec 03 12:17:09 crc kubenswrapper[4711]: E1203 12:17:09.240201 4711 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Dec 03 12:17:09 crc kubenswrapper[4711]: E1203 12:17:09.240244 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cdfc9ef-2385-43ab-ad70-912c119522c7-webhook-cert podName:1cdfc9ef-2385-43ab-ad70-912c119522c7 nodeName:}" failed. No retries permitted until 2025-12-03 12:17:09.740236063 +0000 UTC m=+148.409487318 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/1cdfc9ef-2385-43ab-ad70-912c119522c7-webhook-cert") pod "packageserver-d55dfcdfc-9zjpk" (UID: "1cdfc9ef-2385-43ab-ad70-912c119522c7") : failed to sync secret cache: timed out waiting for the condition Dec 03 12:17:09 crc kubenswrapper[4711]: E1203 12:17:09.240195 4711 secret.go:188] Couldn't get secret openshift-ingress/router-certs-default: failed to sync secret cache: timed out waiting for the condition Dec 03 12:17:09 crc kubenswrapper[4711]: E1203 12:17:09.240280 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23-default-certificate podName:efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23 nodeName:}" failed. No retries permitted until 2025-12-03 12:17:09.740274614 +0000 UTC m=+148.409525869 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "default-certificate" (UniqueName: "kubernetes.io/secret/efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23-default-certificate") pod "router-default-5444994796-ng58r" (UID: "efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23") : failed to sync secret cache: timed out waiting for the condition Dec 03 12:17:09 crc kubenswrapper[4711]: E1203 12:17:09.241371 4711 configmap.go:193] Couldn't get configMap openshift-ingress-operator/trusted-ca: failed to sync configmap cache: timed out waiting for the condition Dec 03 12:17:09 crc kubenswrapper[4711]: E1203 12:17:09.241442 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8e5bc284-86b9-486f-aff2-5f277e323011-trusted-ca podName:8e5bc284-86b9-486f-aff2-5f277e323011 nodeName:}" failed. No retries permitted until 2025-12-03 12:17:09.741422985 +0000 UTC m=+148.410674270 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/8e5bc284-86b9-486f-aff2-5f277e323011-trusted-ca") pod "ingress-operator-5b745b69d9-nmlhj" (UID: "8e5bc284-86b9-486f-aff2-5f277e323011") : failed to sync configmap cache: timed out waiting for the condition Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.246389 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ba64eaf-060d-417e-8831-f67b27012082-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jdq9h\" (UID: \"6ba64eaf-060d-417e-8831-f67b27012082\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jdq9h" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.250700 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.294200 4711 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 03 12:17:09 crc kubenswrapper[4711]: W1203 12:17:09.329008 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-e0f9e353309aa5a9cc87aa6f5d0ede2bb79cabdf8ed1adbb240e765a1ff19345 WatchSource:0}: Error finding container e0f9e353309aa5a9cc87aa6f5d0ede2bb79cabdf8ed1adbb240e765a1ff19345: Status 404 returned error can't find the container with id e0f9e353309aa5a9cc87aa6f5d0ede2bb79cabdf8ed1adbb240e765a1ff19345 Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.330868 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.330876 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.351069 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.371617 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.392329 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.410770 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.439881 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.451117 4711 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-stats-default" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.470477 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.486810 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"e0f9e353309aa5a9cc87aa6f5d0ede2bb79cabdf8ed1adbb240e765a1ff19345"} Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.487768 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"845eab113f33ff3b8479278a594c0e334f43e79bde8808571031748b54187828"} Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.488741 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"33097d0edc49d4153dcfffea1dbbcabafce3770addabe2a751e8902671483904"} Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.491535 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.511053 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.531309 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.551275 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" 
Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.570549 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.592646 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.626618 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6fkb\" (UniqueName: \"kubernetes.io/projected/62513aeb-7bbf-4873-905a-0e52f11618ed-kube-api-access-f6fkb\") pod \"authentication-operator-69f744f599-zpvkj\" (UID: \"62513aeb-7bbf-4873-905a-0e52f11618ed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zpvkj" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.647842 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f274w\" (UniqueName: \"kubernetes.io/projected/f67c1b42-653d-46be-8cf8-d68fe0541f3e-kube-api-access-f274w\") pod \"console-operator-58897d9998-5c4b6\" (UID: \"f67c1b42-653d-46be-8cf8-d68fe0541f3e\") " pod="openshift-console-operator/console-operator-58897d9998-5c4b6" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.665251 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg5qx\" (UniqueName: \"kubernetes.io/projected/96380b31-f010-408a-b4b2-af721875ec8c-kube-api-access-gg5qx\") pod \"console-f9d7485db-fkpkw\" (UID: \"96380b31-f010-408a-b4b2-af721875ec8c\") " pod="openshift-console/console-f9d7485db-fkpkw" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.696636 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rm56\" (UniqueName: \"kubernetes.io/projected/94c142bf-dba2-4cc2-bf4d-13688d6033b0-kube-api-access-6rm56\") pod \"machine-approver-56656f9798-wxvmz\" (UID: 
\"94c142bf-dba2-4cc2-bf4d-13688d6033b0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxvmz" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.711229 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdlq8\" (UniqueName: \"kubernetes.io/projected/a0e8a48c-f60d-494e-8928-061c3235d3f1-kube-api-access-wdlq8\") pod \"route-controller-manager-6576b87f9c-7trmd\" (UID: \"a0e8a48c-f60d-494e-8928-061c3235d3f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7trmd" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.736715 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f49br\" (UniqueName: \"kubernetes.io/projected/cc6e7969-c654-4e5d-b29f-220f2eb0bb58-kube-api-access-f49br\") pod \"apiserver-7bbb656c7d-jp8jl\" (UID: \"cc6e7969-c654-4e5d-b29f-220f2eb0bb58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.751499 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.771415 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.772870 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23-default-certificate\") pod \"router-default-5444994796-ng58r\" (UID: \"efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23\") " pod="openshift-ingress/router-default-5444994796-ng58r" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.773066 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e5bc284-86b9-486f-aff2-5f277e323011-trusted-ca\") pod 
\"ingress-operator-5b745b69d9-nmlhj\" (UID: \"8e5bc284-86b9-486f-aff2-5f277e323011\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nmlhj" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.773127 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1cdfc9ef-2385-43ab-ad70-912c119522c7-webhook-cert\") pod \"packageserver-d55dfcdfc-9zjpk\" (UID: \"1cdfc9ef-2385-43ab-ad70-912c119522c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9zjpk" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.773175 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab93fa35-7435-4886-bcd7-a13ca47bd8fd-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7brcf\" (UID: \"ab93fa35-7435-4886-bcd7-a13ca47bd8fd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7brcf" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.773223 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1cdfc9ef-2385-43ab-ad70-912c119522c7-apiservice-cert\") pod \"packageserver-d55dfcdfc-9zjpk\" (UID: \"1cdfc9ef-2385-43ab-ad70-912c119522c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9zjpk" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.773268 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e5bc284-86b9-486f-aff2-5f277e323011-metrics-tls\") pod \"ingress-operator-5b745b69d9-nmlhj\" (UID: \"8e5bc284-86b9-486f-aff2-5f277e323011\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nmlhj" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.773318 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23-metrics-certs\") pod \"router-default-5444994796-ng58r\" (UID: \"efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23\") " pod="openshift-ingress/router-default-5444994796-ng58r" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.773382 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23-stats-auth\") pod \"router-default-5444994796-ng58r\" (UID: \"efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23\") " pod="openshift-ingress/router-default-5444994796-ng58r" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.773539 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23-service-ca-bundle\") pod \"router-default-5444994796-ng58r\" (UID: \"efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23\") " pod="openshift-ingress/router-default-5444994796-ng58r" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.773607 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab93fa35-7435-4886-bcd7-a13ca47bd8fd-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7brcf\" (UID: \"ab93fa35-7435-4886-bcd7-a13ca47bd8fd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7brcf" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.774893 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e5bc284-86b9-486f-aff2-5f277e323011-trusted-ca\") pod \"ingress-operator-5b745b69d9-nmlhj\" (UID: \"8e5bc284-86b9-486f-aff2-5f277e323011\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nmlhj" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.774970 4711 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab93fa35-7435-4886-bcd7-a13ca47bd8fd-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7brcf\" (UID: \"ab93fa35-7435-4886-bcd7-a13ca47bd8fd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7brcf" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.775052 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23-service-ca-bundle\") pod \"router-default-5444994796-ng58r\" (UID: \"efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23\") " pod="openshift-ingress/router-default-5444994796-ng58r" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.776963 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23-default-certificate\") pod \"router-default-5444994796-ng58r\" (UID: \"efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23\") " pod="openshift-ingress/router-default-5444994796-ng58r" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.778057 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1cdfc9ef-2385-43ab-ad70-912c119522c7-apiservice-cert\") pod \"packageserver-d55dfcdfc-9zjpk\" (UID: \"1cdfc9ef-2385-43ab-ad70-912c119522c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9zjpk" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.778824 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23-metrics-certs\") pod \"router-default-5444994796-ng58r\" (UID: \"efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23\") " pod="openshift-ingress/router-default-5444994796-ng58r" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 
12:17:09.779652 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23-stats-auth\") pod \"router-default-5444994796-ng58r\" (UID: \"efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23\") " pod="openshift-ingress/router-default-5444994796-ng58r" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.780106 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab93fa35-7435-4886-bcd7-a13ca47bd8fd-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7brcf\" (UID: \"ab93fa35-7435-4886-bcd7-a13ca47bd8fd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7brcf" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.780231 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1cdfc9ef-2385-43ab-ad70-912c119522c7-webhook-cert\") pod \"packageserver-d55dfcdfc-9zjpk\" (UID: \"1cdfc9ef-2385-43ab-ad70-912c119522c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9zjpk" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.780748 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e5bc284-86b9-486f-aff2-5f277e323011-metrics-tls\") pod \"ingress-operator-5b745b69d9-nmlhj\" (UID: \"8e5bc284-86b9-486f-aff2-5f277e323011\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nmlhj" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.791243 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.809061 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-5c4b6" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.811360 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.817266 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7trmd" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.831148 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.850998 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.871487 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.874629 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.882212 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxvmz" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.890726 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.905557 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-fkpkw" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.917266 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-zpvkj" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.920193 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.933325 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.952989 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.973141 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 03 12:17:09 crc kubenswrapper[4711]: I1203 12:17:09.998354 4711 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.029528 4711 request.go:700] Waited for 1.890219282s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/serviceaccounts/default/token Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.030988 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cxhv\" (UniqueName: \"kubernetes.io/projected/fe2d0a48-2982-4c9b-945b-940c24cda7ff-kube-api-access-9cxhv\") pod \"cluster-samples-operator-665b6dd947-r72zb\" (UID: \"fe2d0a48-2982-4c9b-945b-940c24cda7ff\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r72zb" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.053000 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdgrv\" (UniqueName: 
\"kubernetes.io/projected/8cc669b1-f54c-476c-ae6c-81f5bf982b7e-kube-api-access-wdgrv\") pod \"downloads-7954f5f757-x6jjm\" (UID: \"8cc669b1-f54c-476c-ae6c-81f5bf982b7e\") " pod="openshift-console/downloads-7954f5f757-x6jjm" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.077506 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhwsj\" (UniqueName: \"kubernetes.io/projected/064e5e0c-68f0-4d05-b054-17948a298623-kube-api-access-vhwsj\") pod \"machine-api-operator-5694c8668f-hnrkf\" (UID: \"064e5e0c-68f0-4d05-b054-17948a298623\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hnrkf" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.096315 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq9jv\" (UniqueName: \"kubernetes.io/projected/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-kube-api-access-vq9jv\") pod \"oauth-openshift-558db77b4-bfrlg\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.104778 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9krlj\" (UniqueName: \"kubernetes.io/projected/25c69a31-2a54-4077-9b8c-c859d1d20849-kube-api-access-9krlj\") pod \"controller-manager-879f6c89f-dtc2k\" (UID: \"25c69a31-2a54-4077-9b8c-c859d1d20849\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtc2k" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.130576 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tkkz\" (UniqueName: \"kubernetes.io/projected/e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69-kube-api-access-6tkkz\") pod \"apiserver-76f77b778f-gx6sm\" (UID: \"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69\") " pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.139806 4711 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-x6jjm" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.154127 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7trmd"] Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.154612 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-296hh\" (UniqueName: \"kubernetes.io/projected/e9a71eaa-37ba-4311-8f1f-30b4580cb030-kube-api-access-296hh\") pod \"machine-config-controller-84d6567774-nj2fj\" (UID: \"e9a71eaa-37ba-4311-8f1f-30b4580cb030\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nj2fj" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.167441 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4lcl\" (UniqueName: \"kubernetes.io/projected/c2f826dc-7227-4d0d-844c-4b84a5902d7b-kube-api-access-w4lcl\") pod \"openshift-apiserver-operator-796bbdcf4f-5gvpv\" (UID: \"c2f826dc-7227-4d0d-844c-4b84a5902d7b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5gvpv" Dec 03 12:17:10 crc kubenswrapper[4711]: W1203 12:17:10.169196 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0e8a48c_f60d_494e_8928_061c3235d3f1.slice/crio-2ffcfcf90ca529fbcec157e6a9f7884d4443d896b1a2284af811dc267cb01332 WatchSource:0}: Error finding container 2ffcfcf90ca529fbcec157e6a9f7884d4443d896b1a2284af811dc267cb01332: Status 404 returned error can't find the container with id 2ffcfcf90ca529fbcec157e6a9f7884d4443d896b1a2284af811dc267cb01332 Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.186089 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mf9p\" (UniqueName: \"kubernetes.io/projected/09fbd47f-ad50-4d45-8505-f73dd228dd58-kube-api-access-5mf9p\") pod 
\"openshift-config-operator-7777fb866f-59vns\" (UID: \"09fbd47f-ad50-4d45-8505-f73dd228dd58\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-59vns" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.189157 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r72zb" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.211210 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtxsk\" (UniqueName: \"kubernetes.io/projected/8b25215d-cee5-46c3-8008-d9165ac32b96-kube-api-access-wtxsk\") pod \"kube-storage-version-migrator-operator-b67b599dd-56rtb\" (UID: \"8b25215d-cee5-46c3-8008-d9165ac32b96\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-56rtb" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.219095 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-59vns" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.225469 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-zpvkj"] Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.228601 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nj2fj" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.232530 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxmx8\" (UniqueName: \"kubernetes.io/projected/42bb9e00-bc53-46ba-96d6-e38a92694817-kube-api-access-bxmx8\") pod \"machine-config-operator-74547568cd-xksnd\" (UID: \"42bb9e00-bc53-46ba-96d6-e38a92694817\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xksnd" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.247637 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqzfw\" (UniqueName: \"kubernetes.io/projected/8e5bc284-86b9-486f-aff2-5f277e323011-kube-api-access-qqzfw\") pod \"ingress-operator-5b745b69d9-nmlhj\" (UID: \"8e5bc284-86b9-486f-aff2-5f277e323011\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nmlhj" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.255109 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.273860 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.277540 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq7rh\" (UniqueName: \"kubernetes.io/projected/e00519d7-eaff-4176-a850-486dbb03e0cc-kube-api-access-kq7rh\") pod \"catalog-operator-68c6474976-khs4n\" (UID: \"e00519d7-eaff-4176-a850-486dbb03e0cc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khs4n" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.283480 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xksnd" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.297210 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94tcf\" (UniqueName: \"kubernetes.io/projected/9cdc9430-275b-4891-858d-a68d9148058e-kube-api-access-94tcf\") pod \"multus-admission-controller-857f4d67dd-45rj8\" (UID: \"9cdc9430-275b-4891-858d-a68d9148058e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-45rj8" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.302523 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-hnrkf" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.308587 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhxh8\" (UniqueName: \"kubernetes.io/projected/9bc96e8b-e5df-40a7-8690-530895999a16-kube-api-access-jhxh8\") pod \"marketplace-operator-79b997595-c8g7k\" (UID: \"9bc96e8b-e5df-40a7-8690-530895999a16\") " pod="openshift-marketplace/marketplace-operator-79b997595-c8g7k" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.331517 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwg5t\" (UniqueName: \"kubernetes.io/projected/9ec52f86-45ff-4a9d-8913-a8848e0589e7-kube-api-access-gwg5t\") pod \"package-server-manager-789f6589d5-66k2k\" (UID: \"9ec52f86-45ff-4a9d-8913-a8848e0589e7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66k2k" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.333748 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khs4n" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.337267 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dtc2k" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.350869 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7730e767-9d1f-4ac7-867a-08760f9d1990-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-s24hc\" (UID: \"7730e767-9d1f-4ac7-867a-08760f9d1990\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s24hc" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.361251 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s24hc" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.368128 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-x6jjm"] Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.368188 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5c4b6"] Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.368488 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5gvpv" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.369338 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b9e2da1c-8e56-47d1-930e-6afacb8839e1-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-6wll9\" (UID: \"b9e2da1c-8e56-47d1-930e-6afacb8839e1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wll9" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.371059 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl"] Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.377985 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-fkpkw"] Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.395062 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-56rtb" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.395947 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab93fa35-7435-4886-bcd7-a13ca47bd8fd-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7brcf\" (UID: \"ab93fa35-7435-4886-bcd7-a13ca47bd8fd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7brcf" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.409573 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6jxb\" (UniqueName: \"kubernetes.io/projected/b9e2da1c-8e56-47d1-930e-6afacb8839e1-kube-api-access-p6jxb\") pod \"cluster-image-registry-operator-dc59b4c8b-6wll9\" (UID: \"b9e2da1c-8e56-47d1-930e-6afacb8839e1\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wll9" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.435337 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5m9v\" (UniqueName: \"kubernetes.io/projected/f3095a2e-f5c5-436e-953a-1fb6ae1950bb-kube-api-access-h5m9v\") pod \"control-plane-machine-set-operator-78cbb6b69f-zhfzk\" (UID: \"f3095a2e-f5c5-436e-953a-1fb6ae1950bb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zhfzk" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.452122 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7brcf" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.457254 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sk2j\" (UniqueName: \"kubernetes.io/projected/01c32d8f-3d2d-4bcf-bb04-620569d59ff5-kube-api-access-2sk2j\") pod \"service-ca-operator-777779d784-dn4mb\" (UID: \"01c32d8f-3d2d-4bcf-bb04-620569d59ff5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dn4mb" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.473818 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kqs9\" (UniqueName: \"kubernetes.io/projected/efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23-kube-api-access-6kqs9\") pod \"router-default-5444994796-ng58r\" (UID: \"efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23\") " pod="openshift-ingress/router-default-5444994796-ng58r" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.490784 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh9h2\" (UniqueName: \"kubernetes.io/projected/09fb9b94-8286-4064-a6f5-15362c4f300d-kube-api-access-zh9h2\") pod \"service-ca-9c57cc56f-5f942\" (UID: \"09fb9b94-8286-4064-a6f5-15362c4f300d\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-5f942" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.502757 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"4db4e21ddbfca23b5c446e3ce95cad1550668fc90a6439f07d7b06cca5dc1bc7"} Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.506900 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"6397f9d121cacea168077aea2bdcaf1a20b356e96c85461dcf36041f224e157d"} Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.507415 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.510151 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c865bfb544c3638cadafca3bf7e1ea3b45a0cb64be0e41b573e1f595ffe9bd79"} Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.521590 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blvj9\" (UniqueName: \"kubernetes.io/projected/1cdfc9ef-2385-43ab-ad70-912c119522c7-kube-api-access-blvj9\") pod \"packageserver-d55dfcdfc-9zjpk\" (UID: \"1cdfc9ef-2385-43ab-ad70-912c119522c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9zjpk" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.530702 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m4m7\" (UniqueName: \"kubernetes.io/projected/2975f7c8-1440-4eac-bccb-8746f327f768-kube-api-access-5m4m7\") pod \"migrator-59844c95c7-zzqqb\" (UID: 
\"2975f7c8-1440-4eac-bccb-8746f327f768\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zzqqb" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.532627 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7trmd" event={"ID":"a0e8a48c-f60d-494e-8928-061c3235d3f1","Type":"ContainerStarted","Data":"43ad24bccc0cfc99c100c4a3eb26ea0beefafd37164ea5e5f642992b834d3686"} Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.533225 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7trmd" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.533241 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7trmd" event={"ID":"a0e8a48c-f60d-494e-8928-061c3235d3f1","Type":"ContainerStarted","Data":"2ffcfcf90ca529fbcec157e6a9f7884d4443d896b1a2284af811dc267cb01332"} Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.535729 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-5c4b6" event={"ID":"f67c1b42-653d-46be-8cf8-d68fe0541f3e","Type":"ContainerStarted","Data":"06df192031a3aab19d3a19ebaab3cbf5f3b332cebac85d2904a11dece407f14b"} Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.537841 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" event={"ID":"cc6e7969-c654-4e5d-b29f-220f2eb0bb58","Type":"ContainerStarted","Data":"5c37aab63379e4d1ba246b9152e73cfd890ff336c69f025630973a96f0ddb673"} Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.539516 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r72zb"] Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.543142 4711 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-zpvkj" event={"ID":"62513aeb-7bbf-4873-905a-0e52f11618ed","Type":"ContainerStarted","Data":"8b04486f48b45e0939b3158f61632de2da65885b9cc5dacaa0d5955859975d33"} Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.548047 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8fcw\" (UniqueName: \"kubernetes.io/projected/14ef9276-a825-478b-bee0-674657696c8e-kube-api-access-k8fcw\") pod \"olm-operator-6b444d44fb-q47nn\" (UID: \"14ef9276-a825-478b-bee0-674657696c8e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q47nn" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.550034 4711 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-7trmd container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.550079 4711 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7trmd" podUID="a0e8a48c-f60d-494e-8928-061c3235d3f1" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.556732 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nj2fj"] Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.563807 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-fkpkw" event={"ID":"96380b31-f010-408a-b4b2-af721875ec8c","Type":"ContainerStarted","Data":"4aa05af6b0093de9db59b065277042ef1a96db0a17af4871d54fa1dd53f0e04c"} 
Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.569437 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04039569-bc9d-4bbe-972a-235add83f1b8-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-fmpjf\" (UID: \"04039569-bc9d-4bbe-972a-235add83f1b8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmpjf" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.575212 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-45rj8" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.587185 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxvmz" event={"ID":"94c142bf-dba2-4cc2-bf4d-13688d6033b0","Type":"ContainerStarted","Data":"2b17e8486986a3bd8d2c4dc29ca19a3c950fc691014b5507764d73f699573113"} Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.587229 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxvmz" event={"ID":"94c142bf-dba2-4cc2-bf4d-13688d6033b0","Type":"ContainerStarted","Data":"3f0f5126b87fb657995c2cedc0a20f2ea5a49d5520d064920b5ac1c21858fa7a"} Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.589883 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mktx6\" (UniqueName: \"kubernetes.io/projected/6ba64eaf-060d-417e-8831-f67b27012082-kube-api-access-mktx6\") pod \"openshift-controller-manager-operator-756b6f6bc6-jdq9h\" (UID: \"6ba64eaf-060d-417e-8831-f67b27012082\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jdq9h" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.590266 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66k2k" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.598426 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-c8g7k" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.605298 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-x6jjm" event={"ID":"8cc669b1-f54c-476c-ae6c-81f5bf982b7e","Type":"ContainerStarted","Data":"6ca13a59062dbf77d34609f04637edb803d10ee4628c56527deac66670243558"} Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.605600 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dn4mb" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.626633 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zzqqb" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.634702 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjnsj\" (UniqueName: \"kubernetes.io/projected/a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95-kube-api-access-gjnsj\") pod \"collect-profiles-29412735-shrp8\" (UID: \"a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-shrp8" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.640111 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q47nn" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.646333 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-829fm\" (UniqueName: \"kubernetes.io/projected/10e059c5-9cf9-46f7-b31a-b97aae40f17a-kube-api-access-829fm\") pod \"dns-operator-744455d44c-mjzpn\" (UID: \"10e059c5-9cf9-46f7-b31a-b97aae40f17a\") " pod="openshift-dns-operator/dns-operator-744455d44c-mjzpn" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.649644 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wll9" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.656321 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8e5bc284-86b9-486f-aff2-5f277e323011-bound-sa-token\") pod \"ingress-operator-5b745b69d9-nmlhj\" (UID: \"8e5bc284-86b9-486f-aff2-5f277e323011\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nmlhj" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.680259 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-shrp8" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.683815 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-59vns"] Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.688814 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zhfzk" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.690270 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e58cccb4-9efa-494f-98c3-fbf61d444804-ca-trust-extracted\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.690317 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4kcq\" (UniqueName: \"kubernetes.io/projected/1a334f3a-66dc-447a-822c-c22cc1f4fd2f-kube-api-access-m4kcq\") pod \"etcd-operator-b45778765-hm68r\" (UID: \"1a334f3a-66dc-447a-822c-c22cc1f4fd2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm68r" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.690368 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1a334f3a-66dc-447a-822c-c22cc1f4fd2f-etcd-client\") pod \"etcd-operator-b45778765-hm68r\" (UID: \"1a334f3a-66dc-447a-822c-c22cc1f4fd2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm68r" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.690427 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e58cccb4-9efa-494f-98c3-fbf61d444804-registry-tls\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.690536 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/1a334f3a-66dc-447a-822c-c22cc1f4fd2f-etcd-service-ca\") pod \"etcd-operator-b45778765-hm68r\" (UID: \"1a334f3a-66dc-447a-822c-c22cc1f4fd2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm68r" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.690615 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp4n7\" (UniqueName: \"kubernetes.io/projected/e58cccb4-9efa-494f-98c3-fbf61d444804-kube-api-access-cp4n7\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.690689 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a334f3a-66dc-447a-822c-c22cc1f4fd2f-config\") pod \"etcd-operator-b45778765-hm68r\" (UID: \"1a334f3a-66dc-447a-822c-c22cc1f4fd2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm68r" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.690748 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e58cccb4-9efa-494f-98c3-fbf61d444804-installation-pull-secrets\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.690799 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e58cccb4-9efa-494f-98c3-fbf61d444804-bound-sa-token\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 
12:17:10 crc kubenswrapper[4711]: E1203 12:17:10.691276 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:11.191259943 +0000 UTC m=+149.860511198 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.690901 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.691624 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e58cccb4-9efa-494f-98c3-fbf61d444804-trusted-ca\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.691685 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e58cccb4-9efa-494f-98c3-fbf61d444804-registry-certificates\") pod 
\"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.691785 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/1a334f3a-66dc-447a-822c-c22cc1f4fd2f-etcd-ca\") pod \"etcd-operator-b45778765-hm68r\" (UID: \"1a334f3a-66dc-447a-822c-c22cc1f4fd2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm68r" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.691859 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a334f3a-66dc-447a-822c-c22cc1f4fd2f-serving-cert\") pod \"etcd-operator-b45778765-hm68r\" (UID: \"1a334f3a-66dc-447a-822c-c22cc1f4fd2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm68r" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.703449 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5f942" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.715817 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jdq9h" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.720807 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmpjf" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.729201 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9zjpk" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.735441 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nmlhj" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.752178 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-ng58r" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.802149 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.802480 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e58cccb4-9efa-494f-98c3-fbf61d444804-installation-pull-secrets\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.802547 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/30e431ee-b97c-4e18-91b0-ebc41960d7a5-mountpoint-dir\") pod \"csi-hostpathplugin-bs88v\" (UID: \"30e431ee-b97c-4e18-91b0-ebc41960d7a5\") " pod="hostpath-provisioner/csi-hostpathplugin-bs88v" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.802580 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e58cccb4-9efa-494f-98c3-fbf61d444804-bound-sa-token\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.802813 
4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d641a4b9-732f-4cb2-890c-4170b1450aea-config-volume\") pod \"dns-default-fh86s\" (UID: \"d641a4b9-732f-4cb2-890c-4170b1450aea\") " pod="openshift-dns/dns-default-fh86s" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.802865 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/30e431ee-b97c-4e18-91b0-ebc41960d7a5-registration-dir\") pod \"csi-hostpathplugin-bs88v\" (UID: \"30e431ee-b97c-4e18-91b0-ebc41960d7a5\") " pod="hostpath-provisioner/csi-hostpathplugin-bs88v" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.802928 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/2bc8f1ff-9c7a-41db-90ea-d4b1a9ce6261-certs\") pod \"machine-config-server-9zqgp\" (UID: \"2bc8f1ff-9c7a-41db-90ea-d4b1a9ce6261\") " pod="openshift-machine-config-operator/machine-config-server-9zqgp" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.803024 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e58cccb4-9efa-494f-98c3-fbf61d444804-trusted-ca\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.803044 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/30e431ee-b97c-4e18-91b0-ebc41960d7a5-socket-dir\") pod \"csi-hostpathplugin-bs88v\" (UID: \"30e431ee-b97c-4e18-91b0-ebc41960d7a5\") " pod="hostpath-provisioner/csi-hostpathplugin-bs88v" Dec 03 12:17:10 crc kubenswrapper[4711]: 
I1203 12:17:10.803064 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/2bc8f1ff-9c7a-41db-90ea-d4b1a9ce6261-node-bootstrap-token\") pod \"machine-config-server-9zqgp\" (UID: \"2bc8f1ff-9c7a-41db-90ea-d4b1a9ce6261\") " pod="openshift-machine-config-operator/machine-config-server-9zqgp" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.803103 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e58cccb4-9efa-494f-98c3-fbf61d444804-registry-certificates\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.803180 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hgsf\" (UniqueName: \"kubernetes.io/projected/30e431ee-b97c-4e18-91b0-ebc41960d7a5-kube-api-access-6hgsf\") pod \"csi-hostpathplugin-bs88v\" (UID: \"30e431ee-b97c-4e18-91b0-ebc41960d7a5\") " pod="hostpath-provisioner/csi-hostpathplugin-bs88v" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.803313 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/1a334f3a-66dc-447a-822c-c22cc1f4fd2f-etcd-ca\") pod \"etcd-operator-b45778765-hm68r\" (UID: \"1a334f3a-66dc-447a-822c-c22cc1f4fd2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm68r" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.803332 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/402486c6-ecf5-4f13-9550-724ad7ad6b8b-cert\") pod \"ingress-canary-hxjp8\" (UID: \"402486c6-ecf5-4f13-9550-724ad7ad6b8b\") " 
pod="openshift-ingress-canary/ingress-canary-hxjp8" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.803369 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/30e431ee-b97c-4e18-91b0-ebc41960d7a5-plugins-dir\") pod \"csi-hostpathplugin-bs88v\" (UID: \"30e431ee-b97c-4e18-91b0-ebc41960d7a5\") " pod="hostpath-provisioner/csi-hostpathplugin-bs88v" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.803404 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6pmv\" (UniqueName: \"kubernetes.io/projected/2bc8f1ff-9c7a-41db-90ea-d4b1a9ce6261-kube-api-access-w6pmv\") pod \"machine-config-server-9zqgp\" (UID: \"2bc8f1ff-9c7a-41db-90ea-d4b1a9ce6261\") " pod="openshift-machine-config-operator/machine-config-server-9zqgp" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.803496 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a334f3a-66dc-447a-822c-c22cc1f4fd2f-serving-cert\") pod \"etcd-operator-b45778765-hm68r\" (UID: \"1a334f3a-66dc-447a-822c-c22cc1f4fd2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm68r" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.803573 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e58cccb4-9efa-494f-98c3-fbf61d444804-ca-trust-extracted\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:10 crc kubenswrapper[4711]: E1203 12:17:10.803720 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2025-12-03 12:17:11.303692839 +0000 UTC m=+149.972944094 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.803819 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4kcq\" (UniqueName: \"kubernetes.io/projected/1a334f3a-66dc-447a-822c-c22cc1f4fd2f-kube-api-access-m4kcq\") pod \"etcd-operator-b45778765-hm68r\" (UID: \"1a334f3a-66dc-447a-822c-c22cc1f4fd2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm68r" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.803879 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1a334f3a-66dc-447a-822c-c22cc1f4fd2f-etcd-client\") pod \"etcd-operator-b45778765-hm68r\" (UID: \"1a334f3a-66dc-447a-822c-c22cc1f4fd2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm68r" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.803944 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e58cccb4-9efa-494f-98c3-fbf61d444804-ca-trust-extracted\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.804015 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/e58cccb4-9efa-494f-98c3-fbf61d444804-registry-tls\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.804058 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf9kr\" (UniqueName: \"kubernetes.io/projected/d641a4b9-732f-4cb2-890c-4170b1450aea-kube-api-access-xf9kr\") pod \"dns-default-fh86s\" (UID: \"d641a4b9-732f-4cb2-890c-4170b1450aea\") " pod="openshift-dns/dns-default-fh86s" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.804311 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/1a334f3a-66dc-447a-822c-c22cc1f4fd2f-etcd-service-ca\") pod \"etcd-operator-b45778765-hm68r\" (UID: \"1a334f3a-66dc-447a-822c-c22cc1f4fd2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm68r" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.804372 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/30e431ee-b97c-4e18-91b0-ebc41960d7a5-csi-data-dir\") pod \"csi-hostpathplugin-bs88v\" (UID: \"30e431ee-b97c-4e18-91b0-ebc41960d7a5\") " pod="hostpath-provisioner/csi-hostpathplugin-bs88v" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.805889 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp4n7\" (UniqueName: \"kubernetes.io/projected/e58cccb4-9efa-494f-98c3-fbf61d444804-kube-api-access-cp4n7\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.806106 4711 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d641a4b9-732f-4cb2-890c-4170b1450aea-metrics-tls\") pod \"dns-default-fh86s\" (UID: \"d641a4b9-732f-4cb2-890c-4170b1450aea\") " pod="openshift-dns/dns-default-fh86s" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.806754 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a334f3a-66dc-447a-822c-c22cc1f4fd2f-config\") pod \"etcd-operator-b45778765-hm68r\" (UID: \"1a334f3a-66dc-447a-822c-c22cc1f4fd2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm68r" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.806926 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79x5x\" (UniqueName: \"kubernetes.io/projected/402486c6-ecf5-4f13-9550-724ad7ad6b8b-kube-api-access-79x5x\") pod \"ingress-canary-hxjp8\" (UID: \"402486c6-ecf5-4f13-9550-724ad7ad6b8b\") " pod="openshift-ingress-canary/ingress-canary-hxjp8" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.807060 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/1a334f3a-66dc-447a-822c-c22cc1f4fd2f-etcd-ca\") pod \"etcd-operator-b45778765-hm68r\" (UID: \"1a334f3a-66dc-447a-822c-c22cc1f4fd2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm68r" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.812609 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e58cccb4-9efa-494f-98c3-fbf61d444804-registry-certificates\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.812712 4711 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e58cccb4-9efa-494f-98c3-fbf61d444804-trusted-ca\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.814458 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/1a334f3a-66dc-447a-822c-c22cc1f4fd2f-etcd-service-ca\") pod \"etcd-operator-b45778765-hm68r\" (UID: \"1a334f3a-66dc-447a-822c-c22cc1f4fd2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm68r" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.814633 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a334f3a-66dc-447a-822c-c22cc1f4fd2f-config\") pod \"etcd-operator-b45778765-hm68r\" (UID: \"1a334f3a-66dc-447a-822c-c22cc1f4fd2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm68r" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.822309 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a334f3a-66dc-447a-822c-c22cc1f4fd2f-serving-cert\") pod \"etcd-operator-b45778765-hm68r\" (UID: \"1a334f3a-66dc-447a-822c-c22cc1f4fd2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm68r" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.822614 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e58cccb4-9efa-494f-98c3-fbf61d444804-installation-pull-secrets\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.835849 4711 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1a334f3a-66dc-447a-822c-c22cc1f4fd2f-etcd-client\") pod \"etcd-operator-b45778765-hm68r\" (UID: \"1a334f3a-66dc-447a-822c-c22cc1f4fd2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm68r" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.840221 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e58cccb4-9efa-494f-98c3-fbf61d444804-registry-tls\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.864379 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e58cccb4-9efa-494f-98c3-fbf61d444804-bound-sa-token\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.869160 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4kcq\" (UniqueName: \"kubernetes.io/projected/1a334f3a-66dc-447a-822c-c22cc1f4fd2f-kube-api-access-m4kcq\") pod \"etcd-operator-b45778765-hm68r\" (UID: \"1a334f3a-66dc-447a-822c-c22cc1f4fd2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm68r" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.896025 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-xksnd"] Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.901826 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp4n7\" (UniqueName: \"kubernetes.io/projected/e58cccb4-9efa-494f-98c3-fbf61d444804-kube-api-access-cp4n7\") pod 
\"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.909201 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d641a4b9-732f-4cb2-890c-4170b1450aea-config-volume\") pod \"dns-default-fh86s\" (UID: \"d641a4b9-732f-4cb2-890c-4170b1450aea\") " pod="openshift-dns/dns-default-fh86s" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.909254 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.909277 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/30e431ee-b97c-4e18-91b0-ebc41960d7a5-registration-dir\") pod \"csi-hostpathplugin-bs88v\" (UID: \"30e431ee-b97c-4e18-91b0-ebc41960d7a5\") " pod="hostpath-provisioner/csi-hostpathplugin-bs88v" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.909307 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/2bc8f1ff-9c7a-41db-90ea-d4b1a9ce6261-certs\") pod \"machine-config-server-9zqgp\" (UID: \"2bc8f1ff-9c7a-41db-90ea-d4b1a9ce6261\") " pod="openshift-machine-config-operator/machine-config-server-9zqgp" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.909342 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/30e431ee-b97c-4e18-91b0-ebc41960d7a5-socket-dir\") pod \"csi-hostpathplugin-bs88v\" (UID: \"30e431ee-b97c-4e18-91b0-ebc41960d7a5\") " pod="hostpath-provisioner/csi-hostpathplugin-bs88v" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.909360 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/2bc8f1ff-9c7a-41db-90ea-d4b1a9ce6261-node-bootstrap-token\") pod \"machine-config-server-9zqgp\" (UID: \"2bc8f1ff-9c7a-41db-90ea-d4b1a9ce6261\") " pod="openshift-machine-config-operator/machine-config-server-9zqgp" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.909390 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hgsf\" (UniqueName: \"kubernetes.io/projected/30e431ee-b97c-4e18-91b0-ebc41960d7a5-kube-api-access-6hgsf\") pod \"csi-hostpathplugin-bs88v\" (UID: \"30e431ee-b97c-4e18-91b0-ebc41960d7a5\") " pod="hostpath-provisioner/csi-hostpathplugin-bs88v" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.909431 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/402486c6-ecf5-4f13-9550-724ad7ad6b8b-cert\") pod \"ingress-canary-hxjp8\" (UID: \"402486c6-ecf5-4f13-9550-724ad7ad6b8b\") " pod="openshift-ingress-canary/ingress-canary-hxjp8" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.909457 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/30e431ee-b97c-4e18-91b0-ebc41960d7a5-plugins-dir\") pod \"csi-hostpathplugin-bs88v\" (UID: \"30e431ee-b97c-4e18-91b0-ebc41960d7a5\") " pod="hostpath-provisioner/csi-hostpathplugin-bs88v" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.909482 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6pmv\" (UniqueName: 
\"kubernetes.io/projected/2bc8f1ff-9c7a-41db-90ea-d4b1a9ce6261-kube-api-access-w6pmv\") pod \"machine-config-server-9zqgp\" (UID: \"2bc8f1ff-9c7a-41db-90ea-d4b1a9ce6261\") " pod="openshift-machine-config-operator/machine-config-server-9zqgp" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.909543 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf9kr\" (UniqueName: \"kubernetes.io/projected/d641a4b9-732f-4cb2-890c-4170b1450aea-kube-api-access-xf9kr\") pod \"dns-default-fh86s\" (UID: \"d641a4b9-732f-4cb2-890c-4170b1450aea\") " pod="openshift-dns/dns-default-fh86s" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.909599 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/30e431ee-b97c-4e18-91b0-ebc41960d7a5-csi-data-dir\") pod \"csi-hostpathplugin-bs88v\" (UID: \"30e431ee-b97c-4e18-91b0-ebc41960d7a5\") " pod="hostpath-provisioner/csi-hostpathplugin-bs88v" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.909632 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d641a4b9-732f-4cb2-890c-4170b1450aea-metrics-tls\") pod \"dns-default-fh86s\" (UID: \"d641a4b9-732f-4cb2-890c-4170b1450aea\") " pod="openshift-dns/dns-default-fh86s" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.909681 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79x5x\" (UniqueName: \"kubernetes.io/projected/402486c6-ecf5-4f13-9550-724ad7ad6b8b-kube-api-access-79x5x\") pod \"ingress-canary-hxjp8\" (UID: \"402486c6-ecf5-4f13-9550-724ad7ad6b8b\") " pod="openshift-ingress-canary/ingress-canary-hxjp8" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.909718 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: 
\"kubernetes.io/host-path/30e431ee-b97c-4e18-91b0-ebc41960d7a5-mountpoint-dir\") pod \"csi-hostpathplugin-bs88v\" (UID: \"30e431ee-b97c-4e18-91b0-ebc41960d7a5\") " pod="hostpath-provisioner/csi-hostpathplugin-bs88v" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.910298 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/30e431ee-b97c-4e18-91b0-ebc41960d7a5-plugins-dir\") pod \"csi-hostpathplugin-bs88v\" (UID: \"30e431ee-b97c-4e18-91b0-ebc41960d7a5\") " pod="hostpath-provisioner/csi-hostpathplugin-bs88v" Dec 03 12:17:10 crc kubenswrapper[4711]: E1203 12:17:10.910655 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:11.410639349 +0000 UTC m=+150.079890604 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.911066 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/30e431ee-b97c-4e18-91b0-ebc41960d7a5-csi-data-dir\") pod \"csi-hostpathplugin-bs88v\" (UID: \"30e431ee-b97c-4e18-91b0-ebc41960d7a5\") " pod="hostpath-provisioner/csi-hostpathplugin-bs88v" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.911196 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/30e431ee-b97c-4e18-91b0-ebc41960d7a5-registration-dir\") pod \"csi-hostpathplugin-bs88v\" (UID: \"30e431ee-b97c-4e18-91b0-ebc41960d7a5\") " pod="hostpath-provisioner/csi-hostpathplugin-bs88v" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.911671 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d641a4b9-732f-4cb2-890c-4170b1450aea-config-volume\") pod \"dns-default-fh86s\" (UID: \"d641a4b9-732f-4cb2-890c-4170b1450aea\") " pod="openshift-dns/dns-default-fh86s" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.911717 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/30e431ee-b97c-4e18-91b0-ebc41960d7a5-mountpoint-dir\") pod \"csi-hostpathplugin-bs88v\" (UID: \"30e431ee-b97c-4e18-91b0-ebc41960d7a5\") " pod="hostpath-provisioner/csi-hostpathplugin-bs88v" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.912221 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/30e431ee-b97c-4e18-91b0-ebc41960d7a5-socket-dir\") pod \"csi-hostpathplugin-bs88v\" (UID: \"30e431ee-b97c-4e18-91b0-ebc41960d7a5\") " pod="hostpath-provisioner/csi-hostpathplugin-bs88v" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.913426 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mjzpn" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.917401 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/2bc8f1ff-9c7a-41db-90ea-d4b1a9ce6261-certs\") pod \"machine-config-server-9zqgp\" (UID: \"2bc8f1ff-9c7a-41db-90ea-d4b1a9ce6261\") " pod="openshift-machine-config-operator/machine-config-server-9zqgp" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.919576 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/2bc8f1ff-9c7a-41db-90ea-d4b1a9ce6261-node-bootstrap-token\") pod \"machine-config-server-9zqgp\" (UID: \"2bc8f1ff-9c7a-41db-90ea-d4b1a9ce6261\") " pod="openshift-machine-config-operator/machine-config-server-9zqgp" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.920339 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/402486c6-ecf5-4f13-9550-724ad7ad6b8b-cert\") pod \"ingress-canary-hxjp8\" (UID: \"402486c6-ecf5-4f13-9550-724ad7ad6b8b\") " pod="openshift-ingress-canary/ingress-canary-hxjp8" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.926880 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d641a4b9-732f-4cb2-890c-4170b1450aea-metrics-tls\") pod \"dns-default-fh86s\" (UID: \"d641a4b9-732f-4cb2-890c-4170b1450aea\") " pod="openshift-dns/dns-default-fh86s" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.947338 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6pmv\" (UniqueName: \"kubernetes.io/projected/2bc8f1ff-9c7a-41db-90ea-d4b1a9ce6261-kube-api-access-w6pmv\") pod \"machine-config-server-9zqgp\" (UID: \"2bc8f1ff-9c7a-41db-90ea-d4b1a9ce6261\") " 
pod="openshift-machine-config-operator/machine-config-server-9zqgp" Dec 03 12:17:10 crc kubenswrapper[4711]: I1203 12:17:10.968547 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf9kr\" (UniqueName: \"kubernetes.io/projected/d641a4b9-732f-4cb2-890c-4170b1450aea-kube-api-access-xf9kr\") pod \"dns-default-fh86s\" (UID: \"d641a4b9-732f-4cb2-890c-4170b1450aea\") " pod="openshift-dns/dns-default-fh86s" Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.000574 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79x5x\" (UniqueName: \"kubernetes.io/projected/402486c6-ecf5-4f13-9550-724ad7ad6b8b-kube-api-access-79x5x\") pod \"ingress-canary-hxjp8\" (UID: \"402486c6-ecf5-4f13-9550-724ad7ad6b8b\") " pod="openshift-ingress-canary/ingress-canary-hxjp8" Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.010371 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:11 crc kubenswrapper[4711]: E1203 12:17:11.012529 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:11.512499923 +0000 UTC m=+150.181751178 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.012652 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:11 crc kubenswrapper[4711]: E1203 12:17:11.013042 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:11.513032477 +0000 UTC m=+150.182283732 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.031716 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hgsf\" (UniqueName: \"kubernetes.io/projected/30e431ee-b97c-4e18-91b0-ebc41960d7a5-kube-api-access-6hgsf\") pod \"csi-hostpathplugin-bs88v\" (UID: \"30e431ee-b97c-4e18-91b0-ebc41960d7a5\") " pod="hostpath-provisioner/csi-hostpathplugin-bs88v" Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.067791 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9zqgp" Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.068429 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fh86s" Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.099970 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hxjp8" Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.108425 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-bs88v" Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.113823 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:11 crc kubenswrapper[4711]: E1203 12:17:11.113936 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:11.613892735 +0000 UTC m=+150.283143990 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.126217 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:11 crc kubenswrapper[4711]: E1203 12:17:11.126523 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-12-03 12:17:11.626509971 +0000 UTC m=+150.295761226 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.139457 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-hm68r" Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.227454 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:11 crc kubenswrapper[4711]: E1203 12:17:11.227950 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:11.727893383 +0000 UTC m=+150.397144638 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.228031 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:11 crc kubenswrapper[4711]: E1203 12:17:11.228405 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:11.728389656 +0000 UTC m=+150.397640911 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.329417 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:11 crc kubenswrapper[4711]: E1203 12:17:11.329586 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:11.829561802 +0000 UTC m=+150.498813057 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.331549 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:11 crc kubenswrapper[4711]: E1203 12:17:11.332450 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:11.832431018 +0000 UTC m=+150.501682273 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.329439 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-hnrkf"] Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.427005 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bfrlg"] Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.429664 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gx6sm"] Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.432577 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:11 crc kubenswrapper[4711]: E1203 12:17:11.433107 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:11.933074431 +0000 UTC m=+150.602325686 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.538772 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:11 crc kubenswrapper[4711]: E1203 12:17:11.539494 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:12.039476656 +0000 UTC m=+150.708727921 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.616620 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nj2fj" event={"ID":"e9a71eaa-37ba-4311-8f1f-30b4580cb030","Type":"ContainerStarted","Data":"9590cbf0ba3afea99a39a2a276bacb59424db405fb9245e293a4118e233f33e8"} Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.616676 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nj2fj" event={"ID":"e9a71eaa-37ba-4311-8f1f-30b4580cb030","Type":"ContainerStarted","Data":"ee3d8cf7eff518db0768cd5061cbcf46c0959734b09d13d558ba6702dc243695"} Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.619095 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-zpvkj" event={"ID":"62513aeb-7bbf-4873-905a-0e52f11618ed","Type":"ContainerStarted","Data":"faac6204ba625a7e8ab5cf298b0f6e71c780d0b6b3c4fc5d0d7e55a92f6bef9e"} Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.623966 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-fkpkw" event={"ID":"96380b31-f010-408a-b4b2-af721875ec8c","Type":"ContainerStarted","Data":"9c7840115e3feae97f84855ba8f28191076e0d504f750d63e5c4bf39ae3fb158"} Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.642532 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:11 crc kubenswrapper[4711]: E1203 12:17:11.643469 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:12.143449467 +0000 UTC m=+150.812700722 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.644571 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-5c4b6" event={"ID":"f67c1b42-653d-46be-8cf8-d68fe0541f3e","Type":"ContainerStarted","Data":"30f1d90e4feba771b9bf34c464a14f45371f2e0369bcc493f7d71a28c04d47db"} Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.653970 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-5c4b6" Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.657509 4711 patch_prober.go:28] interesting pod/console-operator-58897d9998-5c4b6 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Dec 03 12:17:11 
crc kubenswrapper[4711]: I1203 12:17:11.657630 4711 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-5c4b6" podUID="f67c1b42-653d-46be-8cf8-d68fe0541f3e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.687218 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-ng58r" event={"ID":"efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23","Type":"ContainerStarted","Data":"2baca4b9bbe4c1a45048ed12925a36bd971b061bc2954dc1782962d1c822491d"} Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.687265 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-ng58r" event={"ID":"efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23","Type":"ContainerStarted","Data":"96fb4431d98655dbfd83e1c1a24568fed503b9dd7f0b28d1e2e7d99ca34b93b6"} Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.692113 4711 generic.go:334] "Generic (PLEG): container finished" podID="cc6e7969-c654-4e5d-b29f-220f2eb0bb58" containerID="1f6f02ff8e7d6c7953218f0246d90fdedaba9f3efcd393f63b7c156fd0b992db" exitCode=0 Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.692191 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" event={"ID":"cc6e7969-c654-4e5d-b29f-220f2eb0bb58","Type":"ContainerDied","Data":"1f6f02ff8e7d6c7953218f0246d90fdedaba9f3efcd393f63b7c156fd0b992db"} Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.694029 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9zqgp" event={"ID":"2bc8f1ff-9c7a-41db-90ea-d4b1a9ce6261","Type":"ContainerStarted","Data":"dfb4e5c62cf6949f408b77042232c15d6a15e349c6d575817496e26ecea64cdc"} Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 
12:17:11.700412 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxvmz" event={"ID":"94c142bf-dba2-4cc2-bf4d-13688d6033b0","Type":"ContainerStarted","Data":"cf30ffb605a35727e8f226dfe7d23fadbfdd899023d70a86d395ea2dc08b4d39"} Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.705183 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xksnd" event={"ID":"42bb9e00-bc53-46ba-96d6-e38a92694817","Type":"ContainerStarted","Data":"5ec0c2c0627b39ce888722e997b2532844a0331ebb442c085eb1669dbf03a898"} Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.708613 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-59vns" event={"ID":"09fbd47f-ad50-4d45-8505-f73dd228dd58","Type":"ContainerStarted","Data":"4e23f248769c815bda33acfef477ef5f81b1324b609f0269c0f6391b04e97fe8"} Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.708669 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-59vns" event={"ID":"09fbd47f-ad50-4d45-8505-f73dd228dd58","Type":"ContainerStarted","Data":"d4bda4f4ca9349f248e125a841ba05c170eeafd9ab9e861ac54a7dbae506a991"} Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.717866 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r72zb" event={"ID":"fe2d0a48-2982-4c9b-945b-940c24cda7ff","Type":"ContainerStarted","Data":"e97abbbb503cd3131c984485ae5eade81f346f16c370da5a9547fca2202cdf4d"} Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.725280 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-x6jjm" 
event={"ID":"8cc669b1-f54c-476c-ae6c-81f5bf982b7e","Type":"ContainerStarted","Data":"3ade4f4cff8883df72f03db5590c6aa49c03a85985f8665a8d4a6dafc217408a"} Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.736840 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7trmd" Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.743845 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:11 crc kubenswrapper[4711]: E1203 12:17:11.745565 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:12.245552269 +0000 UTC m=+150.914803514 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.752793 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-ng58r" Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.847658 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:11 crc kubenswrapper[4711]: E1203 12:17:11.850743 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:12.35071218 +0000 UTC m=+151.019963445 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:11 crc kubenswrapper[4711]: I1203 12:17:11.950424 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:11 crc kubenswrapper[4711]: E1203 12:17:11.950858 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:12.450838959 +0000 UTC m=+151.120090214 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.006010 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7trmd" podStartSLOduration=130.005992694 podStartE2EDuration="2m10.005992694s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:12.003709523 +0000 UTC m=+150.672960788" watchObservedRunningTime="2025-12-03 12:17:12.005992694 +0000 UTC m=+150.675243949" Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.021465 4711 patch_prober.go:28] interesting pod/router-default-5444994796-ng58r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:17:12 crc kubenswrapper[4711]: [-]has-synced failed: reason withheld Dec 03 12:17:12 crc kubenswrapper[4711]: [+]process-running ok Dec 03 12:17:12 crc kubenswrapper[4711]: healthz check failed Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.021524 4711 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ng58r" podUID="efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.052452 4711 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:12 crc kubenswrapper[4711]: E1203 12:17:12.052827 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:12.552813157 +0000 UTC m=+151.222064412 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.153762 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:12 crc kubenswrapper[4711]: E1203 12:17:12.154059 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:12.654045586 +0000 UTC m=+151.323296841 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.254466 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:12 crc kubenswrapper[4711]: E1203 12:17:12.254959 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:12.754939155 +0000 UTC m=+151.424190410 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.356698 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:12 crc kubenswrapper[4711]: E1203 12:17:12.357184 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:12.857166229 +0000 UTC m=+151.526417484 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.459558 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:12 crc kubenswrapper[4711]: E1203 12:17:12.460254 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:12.960231966 +0000 UTC m=+151.629483221 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.528907 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5gvpv"] Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.566755 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:12 crc kubenswrapper[4711]: E1203 12:17:12.567154 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:13.067142874 +0000 UTC m=+151.736394129 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:12 crc kubenswrapper[4711]: W1203 12:17:12.572149 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2f826dc_7227_4d0d_844c_4b84a5902d7b.slice/crio-a312da3bcf7d52b75c03eac41e02338232ca00a7fed00685a04246ec426eaf72 WatchSource:0}: Error finding container a312da3bcf7d52b75c03eac41e02338232ca00a7fed00685a04246ec426eaf72: Status 404 returned error can't find the container with id a312da3bcf7d52b75c03eac41e02338232ca00a7fed00685a04246ec426eaf72 Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.628765 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dtc2k"] Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.628821 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s24hc"] Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.667720 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:12 crc kubenswrapper[4711]: E1203 12:17:12.668044 4711 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:13.168029803 +0000 UTC m=+151.837281058 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.681974 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khs4n"] Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.682900 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-56rtb"] Dec 03 12:17:12 crc kubenswrapper[4711]: W1203 12:17:12.699591 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25c69a31_2a54_4077_9b8c_c859d1d20849.slice/crio-531619ce199a4bd5a6ed146c954693cda7dad75deb1fd460c4562fa41947d4d6 WatchSource:0}: Error finding container 531619ce199a4bd5a6ed146c954693cda7dad75deb1fd460c4562fa41947d4d6: Status 404 returned error can't find the container with id 531619ce199a4bd5a6ed146c954693cda7dad75deb1fd460c4562fa41947d4d6 Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.716620 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-fkpkw" podStartSLOduration=130.716600453 podStartE2EDuration="2m10.716600453s" podCreationTimestamp="2025-12-03 12:15:02 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:12.715481473 +0000 UTC m=+151.384732748" watchObservedRunningTime="2025-12-03 12:17:12.716600453 +0000 UTC m=+151.385851708" Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.740179 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-zpvkj" podStartSLOduration=130.740158479 podStartE2EDuration="2m10.740158479s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:12.739524832 +0000 UTC m=+151.408776107" watchObservedRunningTime="2025-12-03 12:17:12.740158479 +0000 UTC m=+151.409409734" Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.775376 4711 patch_prober.go:28] interesting pod/router-default-5444994796-ng58r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:17:12 crc kubenswrapper[4711]: [-]has-synced failed: reason withheld Dec 03 12:17:12 crc kubenswrapper[4711]: [+]process-running ok Dec 03 12:17:12 crc kubenswrapper[4711]: healthz check failed Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.775424 4711 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ng58r" podUID="efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.775794 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxvmz" podStartSLOduration=130.775781895 podStartE2EDuration="2m10.775781895s" 
podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:12.773973307 +0000 UTC m=+151.443224582" watchObservedRunningTime="2025-12-03 12:17:12.775781895 +0000 UTC m=+151.445033150" Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.776327 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:12 crc kubenswrapper[4711]: E1203 12:17:12.776752 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:13.2767357 +0000 UTC m=+151.945986955 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.781475 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" event={"ID":"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b","Type":"ContainerStarted","Data":"69b8bdba0bc8c1aebec1ee8672cdeb9f982ed8c1db27a73ca92da5e46e6e4618"} Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.781514 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" event={"ID":"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b","Type":"ContainerStarted","Data":"baf1f49578cc09772b352d62c0fd8a37130dc9d44303cdb1797a8af8200deb95"} Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.782337 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.791470 4711 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-bfrlg container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body= Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.791539 4711 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" podUID="4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": 
dial tcp 10.217.0.9:6443: connect: connection refused" Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.795019 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dtc2k" event={"ID":"25c69a31-2a54-4077-9b8c-c859d1d20849","Type":"ContainerStarted","Data":"531619ce199a4bd5a6ed146c954693cda7dad75deb1fd460c4562fa41947d4d6"} Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.799586 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dn4mb"] Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.802329 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9zqgp" event={"ID":"2bc8f1ff-9c7a-41db-90ea-d4b1a9ce6261","Type":"ContainerStarted","Data":"639ab36c8e7b261d0c56daaa3357f52466271bc9d3efd46f66d67dc2b4185ae2"} Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.803921 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-56rtb" event={"ID":"8b25215d-cee5-46c3-8008-d9165ac32b96","Type":"ContainerStarted","Data":"d6d5d11ab81d4cad08049f9a5ce4dc45dd3cc60c0edcc123b00794a95e5b6787"} Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.805049 4711 generic.go:334] "Generic (PLEG): container finished" podID="e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69" containerID="9edc1872b669f74170de762075915b6c475b3082cbee11e34a9a035dd5961a09" exitCode=0 Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.805085 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" event={"ID":"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69","Type":"ContainerDied","Data":"9edc1872b669f74170de762075915b6c475b3082cbee11e34a9a035dd5961a09"} Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.805099 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" event={"ID":"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69","Type":"ContainerStarted","Data":"d04178e17b6b457486c708f1595296d4a43d1d8d955823b9e5fd6061c997c0e2"} Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.830185 4711 generic.go:334] "Generic (PLEG): container finished" podID="09fbd47f-ad50-4d45-8505-f73dd228dd58" containerID="4e23f248769c815bda33acfef477ef5f81b1324b609f0269c0f6391b04e97fe8" exitCode=0 Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.830619 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-59vns" event={"ID":"09fbd47f-ad50-4d45-8505-f73dd228dd58","Type":"ContainerDied","Data":"4e23f248769c815bda33acfef477ef5f81b1324b609f0269c0f6391b04e97fe8"} Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.842182 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wll9"] Dec 03 12:17:12 crc kubenswrapper[4711]: W1203 12:17:12.846482 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01c32d8f_3d2d_4bcf_bb04_620569d59ff5.slice/crio-73ee63aa4ecfc880aa099873939c50eafe5c4fb713dd4a3052f50ac8b28075a8 WatchSource:0}: Error finding container 73ee63aa4ecfc880aa099873939c50eafe5c4fb713dd4a3052f50ac8b28075a8: Status 404 returned error can't find the container with id 73ee63aa4ecfc880aa099873939c50eafe5c4fb713dd4a3052f50ac8b28075a8 Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.847793 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xksnd" event={"ID":"42bb9e00-bc53-46ba-96d6-e38a92694817","Type":"ContainerStarted","Data":"2fba5b51378f92fa07c2624319d5a718ad140db9d86702acb1571d0063a0304a"} Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.847873 4711 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xksnd" event={"ID":"42bb9e00-bc53-46ba-96d6-e38a92694817","Type":"ContainerStarted","Data":"d0c74748722bde2ef660135566b098c0db5fe2cb582f1a5dd21edd360d35964b"} Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.861476 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-5c4b6" podStartSLOduration=130.861449309 podStartE2EDuration="2m10.861449309s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:12.846825571 +0000 UTC m=+151.516076846" watchObservedRunningTime="2025-12-03 12:17:12.861449309 +0000 UTC m=+151.530700584" Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.877423 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:12 crc kubenswrapper[4711]: E1203 12:17:12.878877 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:13.378854701 +0000 UTC m=+152.048105966 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.891466 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r72zb" event={"ID":"fe2d0a48-2982-4c9b-945b-940c24cda7ff","Type":"ContainerStarted","Data":"c0278e515fc8206dccb4a59c60e41f39588d9fa6958cefe2457469d1cee230b5"} Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.891502 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r72zb" event={"ID":"fe2d0a48-2982-4c9b-945b-940c24cda7ff","Type":"ContainerStarted","Data":"0532abd65616c6f74c64aea226b0aa7c2db74dcd1630e5b38a68122c4e4a5ce7"} Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.917925 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" event={"ID":"cc6e7969-c654-4e5d-b29f-220f2eb0bb58","Type":"ContainerStarted","Data":"edfc630dd1f510a44864395ab8feb08dd9664df626693c8a241c82942f37eb1c"} Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.978807 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-x6jjm" podStartSLOduration=130.978791015 podStartE2EDuration="2m10.978791015s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:12.891060636 +0000 UTC m=+151.560311911" 
watchObservedRunningTime="2025-12-03 12:17:12.978791015 +0000 UTC m=+151.648042270" Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.979293 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-ng58r" podStartSLOduration=130.979289818 podStartE2EDuration="2m10.979289818s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:12.958171038 +0000 UTC m=+151.627422313" watchObservedRunningTime="2025-12-03 12:17:12.979289818 +0000 UTC m=+151.648541073" Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.979831 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:12 crc kubenswrapper[4711]: E1203 12:17:12.980212 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:13.480200002 +0000 UTC m=+152.149451257 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:12 crc kubenswrapper[4711]: I1203 12:17:12.980441 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khs4n" event={"ID":"e00519d7-eaff-4176-a850-486dbb03e0cc","Type":"ContainerStarted","Data":"751cfbbcf128a057f70ff617b9b64c1d11f3041ed1cdf920b6c74e6d27b99f78"} Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.025004 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nj2fj" event={"ID":"e9a71eaa-37ba-4311-8f1f-30b4580cb030","Type":"ContainerStarted","Data":"5d9c46498f775fd248fcd7b4d474064221c06e10ca8ba824232f1371cb4ab97e"} Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.038762 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5gvpv" event={"ID":"c2f826dc-7227-4d0d-844c-4b84a5902d7b","Type":"ContainerStarted","Data":"a312da3bcf7d52b75c03eac41e02338232ca00a7fed00685a04246ec426eaf72"} Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.062309 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r72zb" podStartSLOduration=131.062284192 podStartE2EDuration="2m11.062284192s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 
12:17:13.003362608 +0000 UTC m=+151.672613873" watchObservedRunningTime="2025-12-03 12:17:13.062284192 +0000 UTC m=+151.731535447" Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.080428 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:13 crc kubenswrapper[4711]: E1203 12:17:13.082563 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:13.582519089 +0000 UTC m=+152.251770354 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.092807 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" podStartSLOduration=131.092780662 podStartE2EDuration="2m11.092780662s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:13.089427892 +0000 UTC m=+151.758679157" watchObservedRunningTime="2025-12-03 12:17:13.092780662 +0000 UTC m=+151.762031917" Dec 03 12:17:13 crc 
kubenswrapper[4711]: I1203 12:17:13.134067 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-hnrkf" event={"ID":"064e5e0c-68f0-4d05-b054-17948a298623","Type":"ContainerStarted","Data":"58079bda340f00d788befae56491519f2515077f076ad056576c4a8d85ca9be5"} Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.134108 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-hnrkf" event={"ID":"064e5e0c-68f0-4d05-b054-17948a298623","Type":"ContainerStarted","Data":"0bee6dcc6fc2b9ad601f678f83bfc62b8183cdd6e46e4c207767402a6aeea785"} Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.134119 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-hnrkf" event={"ID":"064e5e0c-68f0-4d05-b054-17948a298623","Type":"ContainerStarted","Data":"db4c7e612c64adf5970988398b7258a26b8958ddc242699973f28c9c864575c1"} Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.145355 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-x6jjm" Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.154559 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-5c4b6" Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.165890 4711 patch_prober.go:28] interesting pod/downloads-7954f5f757-x6jjm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.166136 4711 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x6jjm" podUID="8cc669b1-f54c-476c-ae6c-81f5bf982b7e" containerName="download-server" probeResult="failure" output="Get 
\"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.182791 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:13 crc kubenswrapper[4711]: E1203 12:17:13.183166 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:13.683151502 +0000 UTC m=+152.352402767 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.183307 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-zzqqb"] Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.213766 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xksnd" podStartSLOduration=131.213744744 podStartE2EDuration="2m11.213744744s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-03 12:17:13.212537641 +0000 UTC m=+151.881788896" watchObservedRunningTime="2025-12-03 12:17:13.213744744 +0000 UTC m=+151.882995999" Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.237142 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hxjp8"] Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.242177 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5f942"] Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.256951 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmpjf"] Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.261231 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9zjpk"] Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.286750 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:13 crc kubenswrapper[4711]: E1203 12:17:13.287828 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:13.78779378 +0000 UTC m=+152.457045045 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.290175 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:13 crc kubenswrapper[4711]: E1203 12:17:13.291233 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:13.791215682 +0000 UTC m=+152.460466937 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.300688 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66k2k"] Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.308553 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" podStartSLOduration=131.308518911 podStartE2EDuration="2m11.308518911s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:13.238728768 +0000 UTC m=+151.907980033" watchObservedRunningTime="2025-12-03 12:17:13.308518911 +0000 UTC m=+151.977770166" Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.322690 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7brcf"] Dec 03 12:17:13 crc kubenswrapper[4711]: W1203 12:17:13.325856 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09fb9b94_8286_4064_a6f5_15362c4f300d.slice/crio-f3e869b263cdec3aeb015b34cc230d18113d9fbcad16901f77b737cc284d639e WatchSource:0}: Error finding container f3e869b263cdec3aeb015b34cc230d18113d9fbcad16901f77b737cc284d639e: Status 404 returned error can't find the container with id f3e869b263cdec3aeb015b34cc230d18113d9fbcad16901f77b737cc284d639e Dec 
03 12:17:13 crc kubenswrapper[4711]: W1203 12:17:13.332228 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ec52f86_45ff_4a9d_8913_a8848e0589e7.slice/crio-33eee1f65b305acc1b61a51dcd88eaba40a51135041fafe113f6d6e0ec08a7d8 WatchSource:0}: Error finding container 33eee1f65b305acc1b61a51dcd88eaba40a51135041fafe113f6d6e0ec08a7d8: Status 404 returned error can't find the container with id 33eee1f65b305acc1b61a51dcd88eaba40a51135041fafe113f6d6e0ec08a7d8 Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.333683 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-9zqgp" podStartSLOduration=6.333664719 podStartE2EDuration="6.333664719s" podCreationTimestamp="2025-12-03 12:17:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:13.254344822 +0000 UTC m=+151.923596077" watchObservedRunningTime="2025-12-03 12:17:13.333664719 +0000 UTC m=+152.002915974" Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.343400 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-hnrkf" podStartSLOduration=131.343368776 podStartE2EDuration="2m11.343368776s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:13.302248724 +0000 UTC m=+151.971499999" watchObservedRunningTime="2025-12-03 12:17:13.343368776 +0000 UTC m=+152.012620031" Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.346221 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-c8g7k"] Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.346298 4711 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mjzpn"] Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.393347 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412735-shrp8"] Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.397629 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q47nn"] Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.401519 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:13 crc kubenswrapper[4711]: E1203 12:17:13.402473 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:13.902447274 +0000 UTC m=+152.571698529 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.430994 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jdq9h"] Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.435588 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-nmlhj"] Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.436787 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zhfzk"] Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.439712 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-45rj8"] Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.440823 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fh86s"] Dec 03 12:17:13 crc kubenswrapper[4711]: W1203 12:17:13.443566 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04039569_bc9d_4bbe_972a_235add83f1b8.slice/crio-85f15eba8caa2941f206c02e90e1740b8165fbe044aec4dc8caee024f9743a58 WatchSource:0}: Error finding container 85f15eba8caa2941f206c02e90e1740b8165fbe044aec4dc8caee024f9743a58: Status 404 returned error can't find the container with id 85f15eba8caa2941f206c02e90e1740b8165fbe044aec4dc8caee024f9743a58 Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.445533 
4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hm68r"] Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.458389 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nj2fj" podStartSLOduration=131.458365819 podStartE2EDuration="2m11.458365819s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:13.379855875 +0000 UTC m=+152.049107130" watchObservedRunningTime="2025-12-03 12:17:13.458365819 +0000 UTC m=+152.127617074" Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.463979 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bs88v"] Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.464536 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5gvpv" podStartSLOduration=131.464513012 podStartE2EDuration="2m11.464513012s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:13.40266958 +0000 UTC m=+152.071920855" watchObservedRunningTime="2025-12-03 12:17:13.464513012 +0000 UTC m=+152.133764267" Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.508055 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:13 crc kubenswrapper[4711]: 
E1203 12:17:13.508418 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:14.008404008 +0000 UTC m=+152.677655263 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.610638 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:13 crc kubenswrapper[4711]: E1203 12:17:13.610802 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:14.110771797 +0000 UTC m=+152.780023052 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.610963 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:13 crc kubenswrapper[4711]: E1203 12:17:13.611345 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:14.111329281 +0000 UTC m=+152.780580536 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.715408 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:13 crc kubenswrapper[4711]: E1203 12:17:13.716019 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:14.21599876 +0000 UTC m=+152.885250005 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.759103 4711 patch_prober.go:28] interesting pod/router-default-5444994796-ng58r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:17:13 crc kubenswrapper[4711]: [-]has-synced failed: reason withheld Dec 03 12:17:13 crc kubenswrapper[4711]: [+]process-running ok Dec 03 12:17:13 crc kubenswrapper[4711]: healthz check failed Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.759168 4711 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ng58r" podUID="efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.816723 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:13 crc kubenswrapper[4711]: E1203 12:17:13.821978 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-03 12:17:14.317149826 +0000 UTC m=+152.986401091 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:13 crc kubenswrapper[4711]: I1203 12:17:13.918308 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:13 crc kubenswrapper[4711]: E1203 12:17:13.919306 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:14.419281499 +0000 UTC m=+153.088532754 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.025286 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:14 crc kubenswrapper[4711]: E1203 12:17:14.025643 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:14.525628303 +0000 UTC m=+153.194879558 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.136212 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:14 crc kubenswrapper[4711]: E1203 12:17:14.137436 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:14.637414931 +0000 UTC m=+153.306666186 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.141218 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:14 crc kubenswrapper[4711]: E1203 12:17:14.141624 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:14.641610002 +0000 UTC m=+153.310861267 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.242197 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:14 crc kubenswrapper[4711]: E1203 12:17:14.246108 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:14.746086026 +0000 UTC m=+153.415337281 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.317430 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wll9" event={"ID":"b9e2da1c-8e56-47d1-930e-6afacb8839e1","Type":"ContainerStarted","Data":"2e3c8aeeb73e004a5f47690dac918068dad84b87630aa85fee77816c933ad4f0"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.317483 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wll9" event={"ID":"b9e2da1c-8e56-47d1-930e-6afacb8839e1","Type":"ContainerStarted","Data":"4c1d1a791c1868d92d142680556a10f4ad4c365b51e454bed77eea49168209a2"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.341301 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zhfzk" event={"ID":"f3095a2e-f5c5-436e-953a-1fb6ae1950bb","Type":"ContainerStarted","Data":"146059bf905580fd221a006f157fc5527fef9c4e7447d5108baa51d216127416"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.341711 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zhfzk" event={"ID":"f3095a2e-f5c5-436e-953a-1fb6ae1950bb","Type":"ContainerStarted","Data":"05c4c957eec2e1d97115fae0e37dd5c71281dad381407590c4503be1160c6b93"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.355095 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wll9" podStartSLOduration=132.35507596 podStartE2EDuration="2m12.35507596s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:14.343877893 +0000 UTC m=+153.013129158" watchObservedRunningTime="2025-12-03 12:17:14.35507596 +0000 UTC m=+153.024327215" Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.356010 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:14 crc kubenswrapper[4711]: E1203 12:17:14.357155 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:14.857143105 +0000 UTC m=+153.526394360 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.388390 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zhfzk" podStartSLOduration=132.388370144 podStartE2EDuration="2m12.388370144s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:14.386184366 +0000 UTC m=+153.055435621" watchObservedRunningTime="2025-12-03 12:17:14.388370144 +0000 UTC m=+153.057621399" Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.412610 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hxjp8" event={"ID":"402486c6-ecf5-4f13-9550-724ad7ad6b8b","Type":"ContainerStarted","Data":"fd0da858aab624345df17851c48df96d8208a713b2d59e28b0a92d951408beae"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.412654 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hxjp8" event={"ID":"402486c6-ecf5-4f13-9550-724ad7ad6b8b","Type":"ContainerStarted","Data":"0c0dd725d9429ec20a551f5ba00fd1e36eaffbeff5701a0313a2953305650d28"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.458864 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:14 crc kubenswrapper[4711]: E1203 12:17:14.462886 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:14.962865963 +0000 UTC m=+153.632117218 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.470235 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmpjf" event={"ID":"04039569-bc9d-4bbe-972a-235add83f1b8","Type":"ContainerStarted","Data":"85f15eba8caa2941f206c02e90e1740b8165fbe044aec4dc8caee024f9743a58"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.492416 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-hxjp8" podStartSLOduration=7.492385466 podStartE2EDuration="7.492385466s" podCreationTimestamp="2025-12-03 12:17:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:14.465531714 +0000 UTC m=+153.134782959" watchObservedRunningTime="2025-12-03 12:17:14.492385466 +0000 UTC m=+153.161636731" Dec 03 12:17:14 crc 
kubenswrapper[4711]: I1203 12:17:14.527611 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-59vns" event={"ID":"09fbd47f-ad50-4d45-8505-f73dd228dd58","Type":"ContainerStarted","Data":"c6b6c97813bb2c92dd19efdda12d630c2567b0d252a6d3758af850b27e5730e6"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.528027 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-59vns" Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.541479 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khs4n" event={"ID":"e00519d7-eaff-4176-a850-486dbb03e0cc","Type":"ContainerStarted","Data":"8710d443a01aa82bb647bdbdfd52d6775aa7eefd5fe610b97f9cc38924409e24"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.543016 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khs4n" Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.558044 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-59vns" podStartSLOduration=132.55802797 podStartE2EDuration="2m12.55802797s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:14.556316964 +0000 UTC m=+153.225568219" watchObservedRunningTime="2025-12-03 12:17:14.55802797 +0000 UTC m=+153.227279225" Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.563451 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.564606 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mjzpn" event={"ID":"10e059c5-9cf9-46f7-b31a-b97aae40f17a","Type":"ContainerStarted","Data":"79d983967fb27d9efac01aa9fe157da0671b12b2afdfa9fa42c2cb6d1cda36c6"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.564659 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mjzpn" event={"ID":"10e059c5-9cf9-46f7-b31a-b97aae40f17a","Type":"ContainerStarted","Data":"84e17e88713651178bd5868979cde48227f90b39c127fbda4f6fd2fd0dcb631d"} Dec 03 12:17:14 crc kubenswrapper[4711]: E1203 12:17:14.575641 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:15.075607706 +0000 UTC m=+153.744858961 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.598320 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khs4n" Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.598559 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-c8g7k" event={"ID":"9bc96e8b-e5df-40a7-8690-530895999a16","Type":"ContainerStarted","Data":"be1e6648597ceae7a9cac58db6281d0199ed6f6badadab03e47efd8dd3d7ba2c"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.598993 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-c8g7k" Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.619091 4711 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-c8g7k container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.619147 4711 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-c8g7k" podUID="9bc96e8b-e5df-40a7-8690-530895999a16" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 
12:17:14.623523 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khs4n" podStartSLOduration=132.623511648 podStartE2EDuration="2m12.623511648s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:14.623140338 +0000 UTC m=+153.292391603" watchObservedRunningTime="2025-12-03 12:17:14.623511648 +0000 UTC m=+153.292762903" Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.673167 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5f942" event={"ID":"09fb9b94-8286-4064-a6f5-15362c4f300d","Type":"ContainerStarted","Data":"c52001b76e71ce23bbdc60c5a7d94a6e07196fd323ed0738900e26454bd87da7"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.673205 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5f942" event={"ID":"09fb9b94-8286-4064-a6f5-15362c4f300d","Type":"ContainerStarted","Data":"f3e869b263cdec3aeb015b34cc230d18113d9fbcad16901f77b737cc284d639e"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.676937 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:14 crc kubenswrapper[4711]: E1203 12:17:14.677261 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:15.177245085 +0000 UTC m=+153.846496340 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.718751 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9zjpk" event={"ID":"1cdfc9ef-2385-43ab-ad70-912c119522c7","Type":"ContainerStarted","Data":"ee2e270112f68b562773aa99b4b63a4ccb159607a6d31396f143685e060911ca"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.718796 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9zjpk" event={"ID":"1cdfc9ef-2385-43ab-ad70-912c119522c7","Type":"ContainerStarted","Data":"31eb64165f678a5640d963bab316d838d9acab1fde413e9f2b147c96dc94f2ad"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.721767 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9zjpk" Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.728553 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66k2k" event={"ID":"9ec52f86-45ff-4a9d-8913-a8848e0589e7","Type":"ContainerStarted","Data":"db8d00a1117b73370c32f798b24dd06894ae0721d3a0edf36941212728da1036"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.728600 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66k2k" 
event={"ID":"9ec52f86-45ff-4a9d-8913-a8848e0589e7","Type":"ContainerStarted","Data":"33eee1f65b305acc1b61a51dcd88eaba40a51135041fafe113f6d6e0ec08a7d8"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.729622 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7brcf" event={"ID":"ab93fa35-7435-4886-bcd7-a13ca47bd8fd","Type":"ContainerStarted","Data":"91768d6b50220843e208c0d76c5fa1c693585e889ab1c6b1a053c806895e294d"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.731045 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dtc2k" event={"ID":"25c69a31-2a54-4077-9b8c-c859d1d20849","Type":"ContainerStarted","Data":"bac514b78cfebca1d63fa1060ef3aa1e7578bb00cf6cafdca24d1bcf3cd95f89"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.732095 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-dtc2k" Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.733223 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bs88v" event={"ID":"30e431ee-b97c-4e18-91b0-ebc41960d7a5","Type":"ContainerStarted","Data":"6d5d99717928c54107d4ec394e3e60cd88d911c871a5a3b62760f46fc934f8dd"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.734517 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q47nn" event={"ID":"14ef9276-a825-478b-bee0-674657696c8e","Type":"ContainerStarted","Data":"e93753e69b28b349204358ee37ad4247511c97e6a6e27438e74d8d08e295221c"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.735109 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q47nn" Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.736498 4711 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-shrp8" event={"ID":"a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95","Type":"ContainerStarted","Data":"6b05814fa7f6ebe354236343a49e6ec344be2ae41fd36db28ee2c72d6a7ca011"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.738335 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jdq9h" event={"ID":"6ba64eaf-060d-417e-8831-f67b27012082","Type":"ContainerStarted","Data":"a43f229935473eedcae0133a34ace24c3978266fe45702c7f48da90481cab1d4"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.739773 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zzqqb" event={"ID":"2975f7c8-1440-4eac-bccb-8746f327f768","Type":"ContainerStarted","Data":"68e2aaa938fc2f99554be2335983e85395d43747d6adbb172c25c8829afd7f58"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.739795 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zzqqb" event={"ID":"2975f7c8-1440-4eac-bccb-8746f327f768","Type":"ContainerStarted","Data":"258fb8f9cb971e8be4d281797e17480a2eba763ddc0c89426f13a46f4bcf3265"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.741847 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s24hc" event={"ID":"7730e767-9d1f-4ac7-867a-08760f9d1990","Type":"ContainerStarted","Data":"5f82d23407d00e26e20670e482bc113ac714eb863dcf41c2434ea7c1b267eae5"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.741868 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s24hc" 
event={"ID":"7730e767-9d1f-4ac7-867a-08760f9d1990","Type":"ContainerStarted","Data":"99f12ea58dfdfbb41adcb2e800852269cace74200d20bfafed24ce9bd0b32f1a"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.743652 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5gvpv" event={"ID":"c2f826dc-7227-4d0d-844c-4b84a5902d7b","Type":"ContainerStarted","Data":"bc0efac82ead64ee8be84c3c1f2d95978bb75fafc7746637849a7da965b17f4f"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.745746 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-45rj8" event={"ID":"9cdc9430-275b-4891-858d-a68d9148058e","Type":"ContainerStarted","Data":"3c7f2b29319eca9011e5ccba1ff590d57151b80e3719f94ba258092678cdc69d"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.815634 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:14 crc kubenswrapper[4711]: E1203 12:17:14.815972 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:15.315956718 +0000 UTC m=+153.985207973 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.822176 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dn4mb" event={"ID":"01c32d8f-3d2d-4bcf-bb04-620569d59ff5","Type":"ContainerStarted","Data":"e982282f65a07c8db716bdb07f68a0b773c04b6b58e0428f8e992196bf2d5b10"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.822227 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dn4mb" event={"ID":"01c32d8f-3d2d-4bcf-bb04-620569d59ff5","Type":"ContainerStarted","Data":"73ee63aa4ecfc880aa099873939c50eafe5c4fb713dd4a3052f50ac8b28075a8"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.827993 4711 patch_prober.go:28] interesting pod/router-default-5444994796-ng58r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:17:14 crc kubenswrapper[4711]: [-]has-synced failed: reason withheld Dec 03 12:17:14 crc kubenswrapper[4711]: [+]process-running ok Dec 03 12:17:14 crc kubenswrapper[4711]: healthz check failed Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.828305 4711 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ng58r" podUID="efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:17:14 crc 
kubenswrapper[4711]: I1203 12:17:14.830076 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fh86s" event={"ID":"d641a4b9-732f-4cb2-890c-4170b1450aea","Type":"ContainerStarted","Data":"9bbd2aedc62cd1c018b4752819520d1e9b1b3dea5e27620714111c902c286548"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.831243 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-hm68r" event={"ID":"1a334f3a-66dc-447a-822c-c22cc1f4fd2f","Type":"ContainerStarted","Data":"5eddfdeaa5a4338fe437b998bd66cb6432f7372e0af546b042cff869e746021b"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.835132 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" event={"ID":"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69","Type":"ContainerStarted","Data":"798d1acdca74bf90630202990b4927fb6d46374d80dd9aeedb0f63d1f0cf57f1"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.836748 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nmlhj" event={"ID":"8e5bc284-86b9-486f-aff2-5f277e323011","Type":"ContainerStarted","Data":"1af6c29d27cbb5e8af75d3d5612b02da288469165095254b836d185110e52d54"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.839645 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-56rtb" event={"ID":"8b25215d-cee5-46c3-8008-d9165ac32b96","Type":"ContainerStarted","Data":"8a08b21cd7834963a62a4d8a90ccf480cbac1e6a4da46ef589f28f3158ff16f9"} Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.850567 4711 patch_prober.go:28] interesting pod/downloads-7954f5f757-x6jjm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 03 
12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.850636 4711 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x6jjm" podUID="8cc669b1-f54c-476c-ae6c-81f5bf982b7e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.882774 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.882886 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.886419 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-c8g7k" podStartSLOduration=132.886407139 podStartE2EDuration="2m12.886407139s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:14.884512898 +0000 UTC m=+153.553764153" watchObservedRunningTime="2025-12-03 12:17:14.886407139 +0000 UTC m=+153.555658394" Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.888525 4711 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-q47nn container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.888579 4711 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q47nn" podUID="14ef9276-a825-478b-bee0-674657696c8e" containerName="olm-operator" probeResult="failure" 
output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.906849 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-dtc2k" Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.945218 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:14 crc kubenswrapper[4711]: E1203 12:17:14.946578 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:15.446560236 +0000 UTC m=+154.115811501 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:14 crc kubenswrapper[4711]: I1203 12:17:14.946769 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:14 crc kubenswrapper[4711]: E1203 12:17:14.950854 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:15.45083873 +0000 UTC m=+154.120089985 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:14.993769 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.000241 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.048370 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.048546 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-dtc2k" podStartSLOduration=133.048523254 podStartE2EDuration="2m13.048523254s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:15.01973746 +0000 UTC m=+153.688988735" watchObservedRunningTime="2025-12-03 12:17:15.048523254 +0000 UTC m=+153.717774509" Dec 03 12:17:15 crc kubenswrapper[4711]: E1203 12:17:15.048740 4711 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:15.548723909 +0000 UTC m=+154.217975164 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.084688 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dn4mb" podStartSLOduration=133.084672674 podStartE2EDuration="2m13.084672674s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:15.079661661 +0000 UTC m=+153.748912916" watchObservedRunningTime="2025-12-03 12:17:15.084672674 +0000 UTC m=+153.753923929" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.146264 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9zjpk" podStartSLOduration=133.146249179 podStartE2EDuration="2m13.146249179s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:15.143874366 +0000 UTC m=+153.813125641" watchObservedRunningTime="2025-12-03 12:17:15.146249179 +0000 UTC m=+153.815500434" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.152749 
4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:15 crc kubenswrapper[4711]: E1203 12:17:15.153105 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:15.65309505 +0000 UTC m=+154.322346305 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.170695 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s24hc" podStartSLOduration=133.170674287 podStartE2EDuration="2m13.170674287s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:15.165363817 +0000 UTC m=+153.834615072" watchObservedRunningTime="2025-12-03 12:17:15.170674287 +0000 UTC m=+153.839925542" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.189252 4711 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-5wz5c"] Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.195191 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5wz5c" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.201421 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-5f942" podStartSLOduration=133.201393533 podStartE2EDuration="2m13.201393533s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:15.194273464 +0000 UTC m=+153.863524729" watchObservedRunningTime="2025-12-03 12:17:15.201393533 +0000 UTC m=+153.870644788" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.201459 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5wz5c"] Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.202969 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.254230 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:15 crc kubenswrapper[4711]: E1203 12:17:15.254644 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:15.754627196 +0000 UTC m=+154.423878451 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.302768 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q47nn" podStartSLOduration=133.302745105 podStartE2EDuration="2m13.302745105s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:15.293319774 +0000 UTC m=+153.962571049" watchObservedRunningTime="2025-12-03 12:17:15.302745105 +0000 UTC m=+153.971996360" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.337599 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-56rtb" podStartSLOduration=133.337580509 podStartE2EDuration="2m13.337580509s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:15.336041358 +0000 UTC m=+154.005292613" watchObservedRunningTime="2025-12-03 12:17:15.337580509 +0000 UTC m=+154.006831764" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.358726 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb65q\" (UniqueName: \"kubernetes.io/projected/61b13e0f-7d37-469f-bf80-3c4b455c34b0-kube-api-access-rb65q\") pod 
\"community-operators-5wz5c\" (UID: \"61b13e0f-7d37-469f-bf80-3c4b455c34b0\") " pod="openshift-marketplace/community-operators-5wz5c" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.358961 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.358987 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61b13e0f-7d37-469f-bf80-3c4b455c34b0-utilities\") pod \"community-operators-5wz5c\" (UID: \"61b13e0f-7d37-469f-bf80-3c4b455c34b0\") " pod="openshift-marketplace/community-operators-5wz5c" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.359014 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61b13e0f-7d37-469f-bf80-3c4b455c34b0-catalog-content\") pod \"community-operators-5wz5c\" (UID: \"61b13e0f-7d37-469f-bf80-3c4b455c34b0\") " pod="openshift-marketplace/community-operators-5wz5c" Dec 03 12:17:15 crc kubenswrapper[4711]: E1203 12:17:15.359299 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:15.859287896 +0000 UTC m=+154.528539151 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.399992 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7zrh5"] Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.401086 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zrh5" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.413839 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.425573 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7zrh5"] Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.462436 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.462625 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb65q\" (UniqueName: \"kubernetes.io/projected/61b13e0f-7d37-469f-bf80-3c4b455c34b0-kube-api-access-rb65q\") pod \"community-operators-5wz5c\" (UID: \"61b13e0f-7d37-469f-bf80-3c4b455c34b0\") " pod="openshift-marketplace/community-operators-5wz5c" Dec 03 
12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.462666 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61b13e0f-7d37-469f-bf80-3c4b455c34b0-utilities\") pod \"community-operators-5wz5c\" (UID: \"61b13e0f-7d37-469f-bf80-3c4b455c34b0\") " pod="openshift-marketplace/community-operators-5wz5c" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.462693 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61b13e0f-7d37-469f-bf80-3c4b455c34b0-catalog-content\") pod \"community-operators-5wz5c\" (UID: \"61b13e0f-7d37-469f-bf80-3c4b455c34b0\") " pod="openshift-marketplace/community-operators-5wz5c" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.463128 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61b13e0f-7d37-469f-bf80-3c4b455c34b0-catalog-content\") pod \"community-operators-5wz5c\" (UID: \"61b13e0f-7d37-469f-bf80-3c4b455c34b0\") " pod="openshift-marketplace/community-operators-5wz5c" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.463329 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61b13e0f-7d37-469f-bf80-3c4b455c34b0-utilities\") pod \"community-operators-5wz5c\" (UID: \"61b13e0f-7d37-469f-bf80-3c4b455c34b0\") " pod="openshift-marketplace/community-operators-5wz5c" Dec 03 12:17:15 crc kubenswrapper[4711]: E1203 12:17:15.463389 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:15.96337484 +0000 UTC m=+154.632626085 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.468075 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jdq9h" podStartSLOduration=133.468041904 podStartE2EDuration="2m13.468041904s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:15.460395651 +0000 UTC m=+154.129646906" watchObservedRunningTime="2025-12-03 12:17:15.468041904 +0000 UTC m=+154.137293159" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.542162 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb65q\" (UniqueName: \"kubernetes.io/projected/61b13e0f-7d37-469f-bf80-3c4b455c34b0-kube-api-access-rb65q\") pod \"community-operators-5wz5c\" (UID: \"61b13e0f-7d37-469f-bf80-3c4b455c34b0\") " pod="openshift-marketplace/community-operators-5wz5c" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.549651 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-shrp8" podStartSLOduration=133.54963427 podStartE2EDuration="2m13.54963427s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:15.548802658 +0000 UTC m=+154.218053913" 
watchObservedRunningTime="2025-12-03 12:17:15.54963427 +0000 UTC m=+154.218885525" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.575574 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fff3e844-ad61-4914-aaff-fdd90e3a9a58-utilities\") pod \"certified-operators-7zrh5\" (UID: \"fff3e844-ad61-4914-aaff-fdd90e3a9a58\") " pod="openshift-marketplace/certified-operators-7zrh5" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.576070 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fff3e844-ad61-4914-aaff-fdd90e3a9a58-catalog-content\") pod \"certified-operators-7zrh5\" (UID: \"fff3e844-ad61-4914-aaff-fdd90e3a9a58\") " pod="openshift-marketplace/certified-operators-7zrh5" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.576151 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxqdb\" (UniqueName: \"kubernetes.io/projected/fff3e844-ad61-4914-aaff-fdd90e3a9a58-kube-api-access-sxqdb\") pod \"certified-operators-7zrh5\" (UID: \"fff3e844-ad61-4914-aaff-fdd90e3a9a58\") " pod="openshift-marketplace/certified-operators-7zrh5" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.576260 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:15 crc kubenswrapper[4711]: E1203 12:17:15.576639 4711 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:16.076626286 +0000 UTC m=+154.745877541 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.608745 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5wz5c" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.642769 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rvlj8"] Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.644524 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rvlj8" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.664046 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rvlj8"] Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.680034 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:15 crc kubenswrapper[4711]: E1203 12:17:15.680223 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:16.180193977 +0000 UTC m=+154.849445222 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.680351 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fff3e844-ad61-4914-aaff-fdd90e3a9a58-catalog-content\") pod \"certified-operators-7zrh5\" (UID: \"fff3e844-ad61-4914-aaff-fdd90e3a9a58\") " pod="openshift-marketplace/certified-operators-7zrh5" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.680386 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxqdb\" (UniqueName: \"kubernetes.io/projected/fff3e844-ad61-4914-aaff-fdd90e3a9a58-kube-api-access-sxqdb\") pod \"certified-operators-7zrh5\" (UID: \"fff3e844-ad61-4914-aaff-fdd90e3a9a58\") " pod="openshift-marketplace/certified-operators-7zrh5" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.680450 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.680518 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fff3e844-ad61-4914-aaff-fdd90e3a9a58-utilities\") pod \"certified-operators-7zrh5\" (UID: 
\"fff3e844-ad61-4914-aaff-fdd90e3a9a58\") " pod="openshift-marketplace/certified-operators-7zrh5" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.681059 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fff3e844-ad61-4914-aaff-fdd90e3a9a58-utilities\") pod \"certified-operators-7zrh5\" (UID: \"fff3e844-ad61-4914-aaff-fdd90e3a9a58\") " pod="openshift-marketplace/certified-operators-7zrh5" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.681354 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fff3e844-ad61-4914-aaff-fdd90e3a9a58-catalog-content\") pod \"certified-operators-7zrh5\" (UID: \"fff3e844-ad61-4914-aaff-fdd90e3a9a58\") " pod="openshift-marketplace/certified-operators-7zrh5" Dec 03 12:17:15 crc kubenswrapper[4711]: E1203 12:17:15.681853 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:16.181842801 +0000 UTC m=+154.851094056 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.723384 4711 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-9zjpk container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.723441 4711 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9zjpk" podUID="1cdfc9ef-2385-43ab-ad70-912c119522c7" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.31:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.740958 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxqdb\" (UniqueName: \"kubernetes.io/projected/fff3e844-ad61-4914-aaff-fdd90e3a9a58-kube-api-access-sxqdb\") pod \"certified-operators-7zrh5\" (UID: \"fff3e844-ad61-4914-aaff-fdd90e3a9a58\") " pod="openshift-marketplace/certified-operators-7zrh5" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.756327 4711 patch_prober.go:28] interesting pod/router-default-5444994796-ng58r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Dec 03 12:17:15 crc kubenswrapper[4711]: [-]has-synced failed: reason withheld Dec 03 12:17:15 crc kubenswrapper[4711]: [+]process-running ok Dec 03 12:17:15 crc kubenswrapper[4711]: healthz check failed Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.756377 4711 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ng58r" podUID="efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.773168 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zrh5" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.782402 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:15 crc kubenswrapper[4711]: E1203 12:17:15.782672 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:16.282646577 +0000 UTC m=+154.951897832 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.782944 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.782978 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02-utilities\") pod \"community-operators-rvlj8\" (UID: \"80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02\") " pod="openshift-marketplace/community-operators-rvlj8" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.783096 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02-catalog-content\") pod \"community-operators-rvlj8\" (UID: \"80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02\") " pod="openshift-marketplace/community-operators-rvlj8" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.783133 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnkbr\" (UniqueName: 
\"kubernetes.io/projected/80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02-kube-api-access-gnkbr\") pod \"community-operators-rvlj8\" (UID: \"80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02\") " pod="openshift-marketplace/community-operators-rvlj8" Dec 03 12:17:15 crc kubenswrapper[4711]: E1203 12:17:15.783354 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:16.283342696 +0000 UTC m=+154.952593951 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.794825 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bd87v"] Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.795794 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bd87v" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.836072 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bd87v"] Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.860641 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zzqqb" event={"ID":"2975f7c8-1440-4eac-bccb-8746f327f768","Type":"ContainerStarted","Data":"c17aabc4f2a3ac5e0e5e1da81634807185c535372de3b6b9f8f19924c5c28dd9"} Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.865337 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fh86s" event={"ID":"d641a4b9-732f-4cb2-890c-4170b1450aea","Type":"ContainerStarted","Data":"61c7b97229587d10625666a4d082e8936c9cddee6e09ed01c2a068833d8954e2"} Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.865388 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fh86s" event={"ID":"d641a4b9-732f-4cb2-890c-4170b1450aea","Type":"ContainerStarted","Data":"5cde168a6a756811cf546a9fbd10219fa5ae36282514fa9f8b223824f71abee2"} Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.870583 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-fh86s" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.885516 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.885654 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsfdv\" (UniqueName: 
\"kubernetes.io/projected/25415f1a-641d-41b6-b91c-a48b76a9d465-kube-api-access-rsfdv\") pod \"certified-operators-bd87v\" (UID: \"25415f1a-641d-41b6-b91c-a48b76a9d465\") " pod="openshift-marketplace/certified-operators-bd87v" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.885679 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25415f1a-641d-41b6-b91c-a48b76a9d465-utilities\") pod \"certified-operators-bd87v\" (UID: \"25415f1a-641d-41b6-b91c-a48b76a9d465\") " pod="openshift-marketplace/certified-operators-bd87v" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.885732 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02-utilities\") pod \"community-operators-rvlj8\" (UID: \"80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02\") " pod="openshift-marketplace/community-operators-rvlj8" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.885770 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02-catalog-content\") pod \"community-operators-rvlj8\" (UID: \"80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02\") " pod="openshift-marketplace/community-operators-rvlj8" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.885787 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25415f1a-641d-41b6-b91c-a48b76a9d465-catalog-content\") pod \"certified-operators-bd87v\" (UID: \"25415f1a-641d-41b6-b91c-a48b76a9d465\") " pod="openshift-marketplace/certified-operators-bd87v" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.885815 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnkbr\" 
(UniqueName: \"kubernetes.io/projected/80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02-kube-api-access-gnkbr\") pod \"community-operators-rvlj8\" (UID: \"80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02\") " pod="openshift-marketplace/community-operators-rvlj8" Dec 03 12:17:15 crc kubenswrapper[4711]: E1203 12:17:15.886241 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:16.386228098 +0000 UTC m=+155.055479353 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.886588 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02-utilities\") pod \"community-operators-rvlj8\" (UID: \"80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02\") " pod="openshift-marketplace/community-operators-rvlj8" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.886789 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02-catalog-content\") pod \"community-operators-rvlj8\" (UID: \"80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02\") " pod="openshift-marketplace/community-operators-rvlj8" Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.902941 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" event={"ID":"e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69","Type":"ContainerStarted","Data":"421c73fe1d5e682e7b4b46d477cf40c69e5da0ff40e0340d48b9af4ca10e0885"} Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.904263 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jdq9h" event={"ID":"6ba64eaf-060d-417e-8831-f67b27012082","Type":"ContainerStarted","Data":"38a012a4898c7e0e7927c897dd4a5d718a1916fba2115ad2c187023b209c46cf"} Dec 03 12:17:15 crc kubenswrapper[4711]: I1203 12:17:15.949688 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnkbr\" (UniqueName: \"kubernetes.io/projected/80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02-kube-api-access-gnkbr\") pod \"community-operators-rvlj8\" (UID: \"80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02\") " pod="openshift-marketplace/community-operators-rvlj8" Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:15.999523 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25415f1a-641d-41b6-b91c-a48b76a9d465-catalog-content\") pod \"certified-operators-bd87v\" (UID: \"25415f1a-641d-41b6-b91c-a48b76a9d465\") " pod="openshift-marketplace/certified-operators-bd87v" Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:15.999596 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsfdv\" (UniqueName: \"kubernetes.io/projected/25415f1a-641d-41b6-b91c-a48b76a9d465-kube-api-access-rsfdv\") pod \"certified-operators-bd87v\" (UID: \"25415f1a-641d-41b6-b91c-a48b76a9d465\") " pod="openshift-marketplace/certified-operators-bd87v" Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:15.999616 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/25415f1a-641d-41b6-b91c-a48b76a9d465-utilities\") pod \"certified-operators-bd87v\" (UID: \"25415f1a-641d-41b6-b91c-a48b76a9d465\") " pod="openshift-marketplace/certified-operators-bd87v" Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:15.999658 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.000473 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nmlhj" event={"ID":"8e5bc284-86b9-486f-aff2-5f277e323011","Type":"ContainerStarted","Data":"70a197b911acd1b1d1ad59282cd7edabb71063ec0baa04ee4fe5dff1b87dabe9"} Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.000503 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nmlhj" event={"ID":"8e5bc284-86b9-486f-aff2-5f277e323011","Type":"ContainerStarted","Data":"3ddfadf3ad66641f70eeff94b16ea87f0e48fd9cd8b7b633a5d1ee852d882ad4"} Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.000799 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25415f1a-641d-41b6-b91c-a48b76a9d465-catalog-content\") pod \"certified-operators-bd87v\" (UID: \"25415f1a-641d-41b6-b91c-a48b76a9d465\") " pod="openshift-marketplace/certified-operators-bd87v" Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.001679 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25415f1a-641d-41b6-b91c-a48b76a9d465-utilities\") pod 
\"certified-operators-bd87v\" (UID: \"25415f1a-641d-41b6-b91c-a48b76a9d465\") " pod="openshift-marketplace/certified-operators-bd87v" Dec 03 12:17:16 crc kubenswrapper[4711]: E1203 12:17:16.001997 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:16.501978391 +0000 UTC m=+155.171229696 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.003826 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7brcf" event={"ID":"ab93fa35-7435-4886-bcd7-a13ca47bd8fd","Type":"ContainerStarted","Data":"6dc1e9fee7d92e24b46b0fafbcc0d535431de57660b1be7dce62122607c9510a"} Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.021650 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q47nn" event={"ID":"14ef9276-a825-478b-bee0-674657696c8e","Type":"ContainerStarted","Data":"18db3ca2b1e0dffa1d9cdfa3dce48a44031c3da09c0a8489ebeafef4fc82b02e"} Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.024408 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rvlj8" Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.066650 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mjzpn" event={"ID":"10e059c5-9cf9-46f7-b31a-b97aae40f17a","Type":"ContainerStarted","Data":"1311a35a38d0a8bc8cf9e36e94abf13b1bbdfa2c3916b33e866bf57887bbf9ea"} Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.085200 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-c8g7k" event={"ID":"9bc96e8b-e5df-40a7-8690-530895999a16","Type":"ContainerStarted","Data":"b8cce5bd203fd0bc22fc756a03a79d8882d84cf0cd32cde01cee100fdf2b6dea"} Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.086356 4711 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-c8g7k container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.086386 4711 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-c8g7k" podUID="9bc96e8b-e5df-40a7-8690-530895999a16" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.103601 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:16 crc kubenswrapper[4711]: E1203 12:17:16.104806 4711 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:16.604787481 +0000 UTC m=+155.274038746 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.120926 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66k2k" event={"ID":"9ec52f86-45ff-4a9d-8913-a8848e0589e7","Type":"ContainerStarted","Data":"8a01f4cc0183eca4a29ed31010f967893d8037fa80d7b56b2b42a534386f7138"} Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.121134 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66k2k" Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.132068 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsfdv\" (UniqueName: \"kubernetes.io/projected/25415f1a-641d-41b6-b91c-a48b76a9d465-kube-api-access-rsfdv\") pod \"certified-operators-bd87v\" (UID: \"25415f1a-641d-41b6-b91c-a48b76a9d465\") " pod="openshift-marketplace/certified-operators-bd87v" Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.136145 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bd87v" Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.152520 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zzqqb" podStartSLOduration=134.152506989 podStartE2EDuration="2m14.152506989s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:16.152242482 +0000 UTC m=+154.821493737" watchObservedRunningTime="2025-12-03 12:17:16.152506989 +0000 UTC m=+154.821758244" Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.175777 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmpjf" event={"ID":"04039569-bc9d-4bbe-972a-235add83f1b8","Type":"ContainerStarted","Data":"571487581dff56ab4a78e74bfbdedd327c5ca21495e9ea0b3ffef2feb17bf1ba"} Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.205694 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:16 crc kubenswrapper[4711]: E1203 12:17:16.207889 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:16.707877369 +0000 UTC m=+155.377128624 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.218191 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-shrp8" event={"ID":"a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95","Type":"ContainerStarted","Data":"bfc6688eed32d5f0ac91702d200362720c60ef8c668a84d60496f0342e32e430"} Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.220516 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-45rj8" event={"ID":"9cdc9430-275b-4891-858d-a68d9148058e","Type":"ContainerStarted","Data":"7db47eda0b0b7e1ef2cca64fcee55057e194dc2db348261c4cffacb887eb7edc"} Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.221930 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-hm68r" event={"ID":"1a334f3a-66dc-447a-822c-c22cc1f4fd2f","Type":"ContainerStarted","Data":"f4b5d8e4c245d5f02b9620f30ea5d83e423675865c4ca793d3b4582405c82d29"} Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.233536 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-59vns" Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.302660 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jp8jl" Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.306715 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q47nn" Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.309814 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:16 crc kubenswrapper[4711]: E1203 12:17:16.310668 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:16.810647407 +0000 UTC m=+155.479898652 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.311338 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:16 crc kubenswrapper[4711]: E1203 12:17:16.311777 4711 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:16.811764898 +0000 UTC m=+155.481016153 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.324789 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" podStartSLOduration=134.324770442 podStartE2EDuration="2m14.324770442s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:16.292849355 +0000 UTC m=+154.962100620" watchObservedRunningTime="2025-12-03 12:17:16.324770442 +0000 UTC m=+154.994021717" Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.325280 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-fh86s" podStartSLOduration=9.325270836 podStartE2EDuration="9.325270836s" podCreationTimestamp="2025-12-03 12:17:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:16.32429378 +0000 UTC m=+154.993545055" watchObservedRunningTime="2025-12-03 12:17:16.325270836 +0000 UTC m=+154.994522091" Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.414607 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:16 crc kubenswrapper[4711]: E1203 12:17:16.415979 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:16.915954274 +0000 UTC m=+155.585205589 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.444714 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nmlhj" podStartSLOduration=134.444693607 podStartE2EDuration="2m14.444693607s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:16.443931137 +0000 UTC m=+155.113182392" watchObservedRunningTime="2025-12-03 12:17:16.444693607 +0000 UTC m=+155.113944862" Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.478259 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9zjpk" Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.488148 4711 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66k2k" podStartSLOduration=134.4881257 podStartE2EDuration="2m14.4881257s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:16.480520898 +0000 UTC m=+155.149772173" watchObservedRunningTime="2025-12-03 12:17:16.4881257 +0000 UTC m=+155.157376965" Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.517317 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:16 crc kubenswrapper[4711]: E1203 12:17:16.517669 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:17.017654184 +0000 UTC m=+155.686905439 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.564172 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmpjf" podStartSLOduration=134.564155619 podStartE2EDuration="2m14.564155619s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:16.51070002 +0000 UTC m=+155.179951295" watchObservedRunningTime="2025-12-03 12:17:16.564155619 +0000 UTC m=+155.233406874" Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.566584 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-hm68r" podStartSLOduration=134.566577013 podStartE2EDuration="2m14.566577013s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:16.56344149 +0000 UTC m=+155.232692755" watchObservedRunningTime="2025-12-03 12:17:16.566577013 +0000 UTC m=+155.235828258" Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.618428 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:16 crc kubenswrapper[4711]: E1203 12:17:16.618750 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:17.118735398 +0000 UTC m=+155.787986643 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.674077 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-mjzpn" podStartSLOduration=134.674060707 podStartE2EDuration="2m14.674060707s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:16.667094793 +0000 UTC m=+155.336346068" watchObservedRunningTime="2025-12-03 12:17:16.674060707 +0000 UTC m=+155.343311962" Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.725181 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:16 crc kubenswrapper[4711]: E1203 12:17:16.725521 
4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:17.225507623 +0000 UTC m=+155.894758878 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.735375 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7brcf" podStartSLOduration=134.735354495 podStartE2EDuration="2m14.735354495s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:16.734365619 +0000 UTC m=+155.403616894" watchObservedRunningTime="2025-12-03 12:17:16.735354495 +0000 UTC m=+155.404605770" Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.769087 4711 patch_prober.go:28] interesting pod/router-default-5444994796-ng58r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:17:16 crc kubenswrapper[4711]: [-]has-synced failed: reason withheld Dec 03 12:17:16 crc kubenswrapper[4711]: [+]process-running ok Dec 03 12:17:16 crc kubenswrapper[4711]: healthz check failed Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.769150 4711 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5444994796-ng58r" podUID="efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.830489 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:16 crc kubenswrapper[4711]: E1203 12:17:16.831023 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:17.331001154 +0000 UTC m=+156.000252419 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.831088 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:16 crc kubenswrapper[4711]: E1203 12:17:16.831401 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:17.331391844 +0000 UTC m=+156.000643099 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.933006 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:16 crc kubenswrapper[4711]: E1203 12:17:16.933384 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:17.433368153 +0000 UTC m=+156.102619408 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.941298 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5wz5c"] Dec 03 12:17:16 crc kubenswrapper[4711]: W1203 12:17:16.965041 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61b13e0f_7d37_469f_bf80_3c4b455c34b0.slice/crio-f78418c826d5cfb145a831a282ba6c81568ab0b3dee8da93303158c7c239db90 WatchSource:0}: Error finding container f78418c826d5cfb145a831a282ba6c81568ab0b3dee8da93303158c7c239db90: Status 404 returned error can't find the container with id f78418c826d5cfb145a831a282ba6c81568ab0b3dee8da93303158c7c239db90 Dec 03 12:17:16 crc kubenswrapper[4711]: I1203 12:17:16.972651 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7zrh5"] Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.041421 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:17 crc kubenswrapper[4711]: E1203 12:17:17.041701 4711 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:17.541689078 +0000 UTC m=+156.210940333 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.142618 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:17 crc kubenswrapper[4711]: E1203 12:17:17.142863 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:17.642828883 +0000 UTC m=+156.312080138 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.143372 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:17 crc kubenswrapper[4711]: E1203 12:17:17.143734 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:17.643723438 +0000 UTC m=+156.312974693 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.208102 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rjr4q"] Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.209043 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rjr4q" Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.213286 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.245536 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:17 crc kubenswrapper[4711]: E1203 12:17:17.245935 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:17.74589659 +0000 UTC m=+156.415147845 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.254704 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rjr4q"] Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.267940 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bs88v" event={"ID":"30e431ee-b97c-4e18-91b0-ebc41960d7a5","Type":"ContainerStarted","Data":"f4ae789304feda80c6f2c2c78b42abc7c59f14622aa5da9a6eac6e6fc7043e95"} Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.290921 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-45rj8" event={"ID":"9cdc9430-275b-4891-858d-a68d9148058e","Type":"ContainerStarted","Data":"549ce60dd7bae5110143ed7242ee70e53d0a34dee1839471e1bf63f4cd7200b8"} Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.305817 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wz5c" event={"ID":"61b13e0f-7d37-469f-bf80-3c4b455c34b0","Type":"ContainerStarted","Data":"f78418c826d5cfb145a831a282ba6c81568ab0b3dee8da93303158c7c239db90"} Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.320363 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zrh5" event={"ID":"fff3e844-ad61-4914-aaff-fdd90e3a9a58","Type":"ContainerStarted","Data":"073afe88fe57646267b73da04a8f4260fb1ae70c4e75248a2bbb3ca6c2139e7b"} Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.321901 4711 
patch_prober.go:28] interesting pod/marketplace-operator-79b997595-c8g7k container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.321956 4711 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-c8g7k" podUID="9bc96e8b-e5df-40a7-8690-530895999a16" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.340833 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rvlj8"] Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.347900 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.348022 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4mgx\" (UniqueName: \"kubernetes.io/projected/0908002c-11e7-4804-a059-dbd19dc6d739-kube-api-access-v4mgx\") pod \"redhat-marketplace-rjr4q\" (UID: \"0908002c-11e7-4804-a059-dbd19dc6d739\") " pod="openshift-marketplace/redhat-marketplace-rjr4q" Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.348050 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/0908002c-11e7-4804-a059-dbd19dc6d739-catalog-content\") pod \"redhat-marketplace-rjr4q\" (UID: \"0908002c-11e7-4804-a059-dbd19dc6d739\") " pod="openshift-marketplace/redhat-marketplace-rjr4q" Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.348127 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0908002c-11e7-4804-a059-dbd19dc6d739-utilities\") pod \"redhat-marketplace-rjr4q\" (UID: \"0908002c-11e7-4804-a059-dbd19dc6d739\") " pod="openshift-marketplace/redhat-marketplace-rjr4q" Dec 03 12:17:17 crc kubenswrapper[4711]: E1203 12:17:17.349173 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:17.849151042 +0000 UTC m=+156.518402337 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.424646 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-45rj8" podStartSLOduration=135.424630736 podStartE2EDuration="2m15.424630736s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:17.421234206 +0000 UTC m=+156.090485471" watchObservedRunningTime="2025-12-03 
12:17:17.424630736 +0000 UTC m=+156.093881991" Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.427322 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bd87v"] Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.458359 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.458729 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4mgx\" (UniqueName: \"kubernetes.io/projected/0908002c-11e7-4804-a059-dbd19dc6d739-kube-api-access-v4mgx\") pod \"redhat-marketplace-rjr4q\" (UID: \"0908002c-11e7-4804-a059-dbd19dc6d739\") " pod="openshift-marketplace/redhat-marketplace-rjr4q" Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.458817 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0908002c-11e7-4804-a059-dbd19dc6d739-catalog-content\") pod \"redhat-marketplace-rjr4q\" (UID: \"0908002c-11e7-4804-a059-dbd19dc6d739\") " pod="openshift-marketplace/redhat-marketplace-rjr4q" Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.459059 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0908002c-11e7-4804-a059-dbd19dc6d739-utilities\") pod \"redhat-marketplace-rjr4q\" (UID: \"0908002c-11e7-4804-a059-dbd19dc6d739\") " pod="openshift-marketplace/redhat-marketplace-rjr4q" Dec 03 12:17:17 crc kubenswrapper[4711]: E1203 12:17:17.459724 4711 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:17.959704998 +0000 UTC m=+156.628956253 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.465139 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0908002c-11e7-4804-a059-dbd19dc6d739-utilities\") pod \"redhat-marketplace-rjr4q\" (UID: \"0908002c-11e7-4804-a059-dbd19dc6d739\") " pod="openshift-marketplace/redhat-marketplace-rjr4q" Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.466646 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0908002c-11e7-4804-a059-dbd19dc6d739-catalog-content\") pod \"redhat-marketplace-rjr4q\" (UID: \"0908002c-11e7-4804-a059-dbd19dc6d739\") " pod="openshift-marketplace/redhat-marketplace-rjr4q" Dec 03 12:17:17 crc kubenswrapper[4711]: W1203 12:17:17.495461 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25415f1a_641d_41b6_b91c_a48b76a9d465.slice/crio-3e534c9314fe4453cd8e0bb3ef4ce06ce82a5474ac3031c4ea5504af743a47bc WatchSource:0}: Error finding container 3e534c9314fe4453cd8e0bb3ef4ce06ce82a5474ac3031c4ea5504af743a47bc: Status 404 returned error can't find the container with id 
3e534c9314fe4453cd8e0bb3ef4ce06ce82a5474ac3031c4ea5504af743a47bc Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.539941 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4mgx\" (UniqueName: \"kubernetes.io/projected/0908002c-11e7-4804-a059-dbd19dc6d739-kube-api-access-v4mgx\") pod \"redhat-marketplace-rjr4q\" (UID: \"0908002c-11e7-4804-a059-dbd19dc6d739\") " pod="openshift-marketplace/redhat-marketplace-rjr4q" Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.566220 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:17 crc kubenswrapper[4711]: E1203 12:17:17.566519 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:18.066508304 +0000 UTC m=+156.735759559 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.668244 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:17 crc kubenswrapper[4711]: E1203 12:17:17.668832 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:18.168811889 +0000 UTC m=+156.838063144 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.720874 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mj5ds"] Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.721893 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mj5ds" Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.769179 4711 patch_prober.go:28] interesting pod/router-default-5444994796-ng58r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:17:17 crc kubenswrapper[4711]: [-]has-synced failed: reason withheld Dec 03 12:17:17 crc kubenswrapper[4711]: [+]process-running ok Dec 03 12:17:17 crc kubenswrapper[4711]: healthz check failed Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.769227 4711 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ng58r" podUID="efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.769819 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:17 crc kubenswrapper[4711]: E1203 12:17:17.770133 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:18.270121509 +0000 UTC m=+156.939372764 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.779641 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mj5ds"] Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.828286 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rjr4q" Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.878398 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.878924 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmwg5\" (UniqueName: \"kubernetes.io/projected/3d15af25-55a8-4d30-a83e-3bdf5dc963fc-kube-api-access-hmwg5\") pod \"redhat-marketplace-mj5ds\" (UID: \"3d15af25-55a8-4d30-a83e-3bdf5dc963fc\") " pod="openshift-marketplace/redhat-marketplace-mj5ds" Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.878961 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d15af25-55a8-4d30-a83e-3bdf5dc963fc-catalog-content\") pod \"redhat-marketplace-mj5ds\" (UID: 
\"3d15af25-55a8-4d30-a83e-3bdf5dc963fc\") " pod="openshift-marketplace/redhat-marketplace-mj5ds" Dec 03 12:17:17 crc kubenswrapper[4711]: E1203 12:17:17.879014 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:18.37898852 +0000 UTC m=+157.048239765 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.879094 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d15af25-55a8-4d30-a83e-3bdf5dc963fc-utilities\") pod \"redhat-marketplace-mj5ds\" (UID: \"3d15af25-55a8-4d30-a83e-3bdf5dc963fc\") " pod="openshift-marketplace/redhat-marketplace-mj5ds" Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.879132 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:17 crc kubenswrapper[4711]: E1203 12:17:17.879390 4711 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:18.379383821 +0000 UTC m=+157.048635076 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.980437 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.980574 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d15af25-55a8-4d30-a83e-3bdf5dc963fc-utilities\") pod \"redhat-marketplace-mj5ds\" (UID: \"3d15af25-55a8-4d30-a83e-3bdf5dc963fc\") " pod="openshift-marketplace/redhat-marketplace-mj5ds" Dec 03 12:17:17 crc kubenswrapper[4711]: E1203 12:17:17.980604 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:18.480578247 +0000 UTC m=+157.149829502 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.980651 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.980734 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmwg5\" (UniqueName: \"kubernetes.io/projected/3d15af25-55a8-4d30-a83e-3bdf5dc963fc-kube-api-access-hmwg5\") pod \"redhat-marketplace-mj5ds\" (UID: \"3d15af25-55a8-4d30-a83e-3bdf5dc963fc\") " pod="openshift-marketplace/redhat-marketplace-mj5ds" Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.980777 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d15af25-55a8-4d30-a83e-3bdf5dc963fc-catalog-content\") pod \"redhat-marketplace-mj5ds\" (UID: \"3d15af25-55a8-4d30-a83e-3bdf5dc963fc\") " pod="openshift-marketplace/redhat-marketplace-mj5ds" Dec 03 12:17:17 crc kubenswrapper[4711]: E1203 12:17:17.981018 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-03 12:17:18.48100392 +0000 UTC m=+157.150255175 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.981164 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d15af25-55a8-4d30-a83e-3bdf5dc963fc-catalog-content\") pod \"redhat-marketplace-mj5ds\" (UID: \"3d15af25-55a8-4d30-a83e-3bdf5dc963fc\") " pod="openshift-marketplace/redhat-marketplace-mj5ds" Dec 03 12:17:17 crc kubenswrapper[4711]: I1203 12:17:17.981291 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d15af25-55a8-4d30-a83e-3bdf5dc963fc-utilities\") pod \"redhat-marketplace-mj5ds\" (UID: \"3d15af25-55a8-4d30-a83e-3bdf5dc963fc\") " pod="openshift-marketplace/redhat-marketplace-mj5ds" Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.010789 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmwg5\" (UniqueName: \"kubernetes.io/projected/3d15af25-55a8-4d30-a83e-3bdf5dc963fc-kube-api-access-hmwg5\") pod \"redhat-marketplace-mj5ds\" (UID: \"3d15af25-55a8-4d30-a83e-3bdf5dc963fc\") " pod="openshift-marketplace/redhat-marketplace-mj5ds" Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.081881 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:18 crc kubenswrapper[4711]: E1203 12:17:18.082267 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:18.582252638 +0000 UTC m=+157.251503893 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.136956 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mj5ds" Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.184012 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:18 crc kubenswrapper[4711]: E1203 12:17:18.184385 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-03 12:17:18.684373249 +0000 UTC m=+157.353624504 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.284753 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:18 crc kubenswrapper[4711]: E1203 12:17:18.284954 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:18.784923149 +0000 UTC m=+157.454174414 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.285048 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:18 crc kubenswrapper[4711]: E1203 12:17:18.285354 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:18.78534209 +0000 UTC m=+157.454593345 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.346211 4711 generic.go:334] "Generic (PLEG): container finished" podID="fff3e844-ad61-4914-aaff-fdd90e3a9a58" containerID="7aadbc818efad99609b5ec0b7a2a59373c831d7b7e8dada3e382eb34e4157eda" exitCode=0 Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.346264 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zrh5" event={"ID":"fff3e844-ad61-4914-aaff-fdd90e3a9a58","Type":"ContainerDied","Data":"7aadbc818efad99609b5ec0b7a2a59373c831d7b7e8dada3e382eb34e4157eda"} Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.351347 4711 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.356422 4711 generic.go:334] "Generic (PLEG): container finished" podID="25415f1a-641d-41b6-b91c-a48b76a9d465" containerID="462878b8f0b07442429d2b67848e8d18efc160e0a73a517d30d1a84a99fe222e" exitCode=0 Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.356471 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bd87v" event={"ID":"25415f1a-641d-41b6-b91c-a48b76a9d465","Type":"ContainerDied","Data":"462878b8f0b07442429d2b67848e8d18efc160e0a73a517d30d1a84a99fe222e"} Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.356495 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bd87v" 
event={"ID":"25415f1a-641d-41b6-b91c-a48b76a9d465","Type":"ContainerStarted","Data":"3e534c9314fe4453cd8e0bb3ef4ce06ce82a5474ac3031c4ea5504af743a47bc"} Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.373111 4711 generic.go:334] "Generic (PLEG): container finished" podID="80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02" containerID="e708d292c30e2c36c131fce0bb662701bb89b7c5d718099e9882164e03dfe08f" exitCode=0 Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.373184 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvlj8" event={"ID":"80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02","Type":"ContainerDied","Data":"e708d292c30e2c36c131fce0bb662701bb89b7c5d718099e9882164e03dfe08f"} Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.373209 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvlj8" event={"ID":"80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02","Type":"ContainerStarted","Data":"120bd45357dcacfdb2c5383883983a569e1ef2850f52e1e26f71cb7465dc539f"} Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.381313 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lx4g4"] Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.381523 4711 generic.go:334] "Generic (PLEG): container finished" podID="61b13e0f-7d37-469f-bf80-3c4b455c34b0" containerID="63e1267656731e9f6a41e3f006402611ec318375e5ede795dd39088ba67782f6" exitCode=0 Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.383070 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wz5c" event={"ID":"61b13e0f-7d37-469f-bf80-3c4b455c34b0","Type":"ContainerDied","Data":"63e1267656731e9f6a41e3f006402611ec318375e5ede795dd39088ba67782f6"} Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.383138 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lx4g4" Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.392216 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.392782 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:18 crc kubenswrapper[4711]: E1203 12:17:18.393438 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:18.893424191 +0000 UTC m=+157.562675446 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.393625 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:18 crc kubenswrapper[4711]: E1203 12:17:18.395185 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:18.895178237 +0000 UTC m=+157.564429492 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.421408 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lx4g4"] Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.496504 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.497062 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/559a1102-7df3-414e-9a4c-37cd0d35daf1-catalog-content\") pod \"redhat-operators-lx4g4\" (UID: \"559a1102-7df3-414e-9a4c-37cd0d35daf1\") " pod="openshift-marketplace/redhat-operators-lx4g4" Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.497147 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt9cs\" (UniqueName: \"kubernetes.io/projected/559a1102-7df3-414e-9a4c-37cd0d35daf1-kube-api-access-bt9cs\") pod \"redhat-operators-lx4g4\" (UID: \"559a1102-7df3-414e-9a4c-37cd0d35daf1\") " pod="openshift-marketplace/redhat-operators-lx4g4" Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.497192 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/559a1102-7df3-414e-9a4c-37cd0d35daf1-utilities\") pod \"redhat-operators-lx4g4\" (UID: \"559a1102-7df3-414e-9a4c-37cd0d35daf1\") " pod="openshift-marketplace/redhat-operators-lx4g4" Dec 03 12:17:18 crc kubenswrapper[4711]: E1203 12:17:18.497698 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:18.997677608 +0000 UTC m=+157.666928873 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.599806 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/559a1102-7df3-414e-9a4c-37cd0d35daf1-utilities\") pod \"redhat-operators-lx4g4\" (UID: \"559a1102-7df3-414e-9a4c-37cd0d35daf1\") " pod="openshift-marketplace/redhat-operators-lx4g4" Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.599863 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/559a1102-7df3-414e-9a4c-37cd0d35daf1-catalog-content\") pod \"redhat-operators-lx4g4\" (UID: \"559a1102-7df3-414e-9a4c-37cd0d35daf1\") " pod="openshift-marketplace/redhat-operators-lx4g4" Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.599887 4711 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.600016 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt9cs\" (UniqueName: \"kubernetes.io/projected/559a1102-7df3-414e-9a4c-37cd0d35daf1-kube-api-access-bt9cs\") pod \"redhat-operators-lx4g4\" (UID: \"559a1102-7df3-414e-9a4c-37cd0d35daf1\") " pod="openshift-marketplace/redhat-operators-lx4g4" Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.600656 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/559a1102-7df3-414e-9a4c-37cd0d35daf1-utilities\") pod \"redhat-operators-lx4g4\" (UID: \"559a1102-7df3-414e-9a4c-37cd0d35daf1\") " pod="openshift-marketplace/redhat-operators-lx4g4" Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.600876 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/559a1102-7df3-414e-9a4c-37cd0d35daf1-catalog-content\") pod \"redhat-operators-lx4g4\" (UID: \"559a1102-7df3-414e-9a4c-37cd0d35daf1\") " pod="openshift-marketplace/redhat-operators-lx4g4" Dec 03 12:17:18 crc kubenswrapper[4711]: E1203 12:17:18.601139 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:19.101129656 +0000 UTC m=+157.770380911 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.657139 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt9cs\" (UniqueName: \"kubernetes.io/projected/559a1102-7df3-414e-9a4c-37cd0d35daf1-kube-api-access-bt9cs\") pod \"redhat-operators-lx4g4\" (UID: \"559a1102-7df3-414e-9a4c-37cd0d35daf1\") " pod="openshift-marketplace/redhat-operators-lx4g4" Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.701522 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:18 crc kubenswrapper[4711]: E1203 12:17:18.701880 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:19.20186284 +0000 UTC m=+157.871114095 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.756370 4711 patch_prober.go:28] interesting pod/router-default-5444994796-ng58r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:17:18 crc kubenswrapper[4711]: [-]has-synced failed: reason withheld Dec 03 12:17:18 crc kubenswrapper[4711]: [+]process-running ok Dec 03 12:17:18 crc kubenswrapper[4711]: healthz check failed Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.756434 4711 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ng58r" podUID="efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.760249 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lx4g4" Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.784578 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vwwvw"] Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.789669 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vwwvw" Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.807208 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ded0656-834a-4d3d-8cf8-71ed8075e3aa-utilities\") pod \"redhat-operators-vwwvw\" (UID: \"1ded0656-834a-4d3d-8cf8-71ed8075e3aa\") " pod="openshift-marketplace/redhat-operators-vwwvw" Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.807251 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ded0656-834a-4d3d-8cf8-71ed8075e3aa-catalog-content\") pod \"redhat-operators-vwwvw\" (UID: \"1ded0656-834a-4d3d-8cf8-71ed8075e3aa\") " pod="openshift-marketplace/redhat-operators-vwwvw" Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.807276 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsz8v\" (UniqueName: \"kubernetes.io/projected/1ded0656-834a-4d3d-8cf8-71ed8075e3aa-kube-api-access-lsz8v\") pod \"redhat-operators-vwwvw\" (UID: \"1ded0656-834a-4d3d-8cf8-71ed8075e3aa\") " pod="openshift-marketplace/redhat-operators-vwwvw" Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.807336 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:18 crc kubenswrapper[4711]: E1203 12:17:18.807675 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2025-12-03 12:17:19.30766046 +0000 UTC m=+157.976911715 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.810677 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vwwvw"] Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.911506 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.911939 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ded0656-834a-4d3d-8cf8-71ed8075e3aa-utilities\") pod \"redhat-operators-vwwvw\" (UID: \"1ded0656-834a-4d3d-8cf8-71ed8075e3aa\") " pod="openshift-marketplace/redhat-operators-vwwvw" Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.911971 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ded0656-834a-4d3d-8cf8-71ed8075e3aa-catalog-content\") pod \"redhat-operators-vwwvw\" (UID: \"1ded0656-834a-4d3d-8cf8-71ed8075e3aa\") " pod="openshift-marketplace/redhat-operators-vwwvw" Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.911989 4711 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsz8v\" (UniqueName: \"kubernetes.io/projected/1ded0656-834a-4d3d-8cf8-71ed8075e3aa-kube-api-access-lsz8v\") pod \"redhat-operators-vwwvw\" (UID: \"1ded0656-834a-4d3d-8cf8-71ed8075e3aa\") " pod="openshift-marketplace/redhat-operators-vwwvw" Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.914313 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ded0656-834a-4d3d-8cf8-71ed8075e3aa-catalog-content\") pod \"redhat-operators-vwwvw\" (UID: \"1ded0656-834a-4d3d-8cf8-71ed8075e3aa\") " pod="openshift-marketplace/redhat-operators-vwwvw" Dec 03 12:17:18 crc kubenswrapper[4711]: E1203 12:17:18.914380 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:19.414364443 +0000 UTC m=+158.083615698 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.914400 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ded0656-834a-4d3d-8cf8-71ed8075e3aa-utilities\") pod \"redhat-operators-vwwvw\" (UID: \"1ded0656-834a-4d3d-8cf8-71ed8075e3aa\") " pod="openshift-marketplace/redhat-operators-vwwvw" Dec 03 12:17:18 crc kubenswrapper[4711]: I1203 12:17:18.931544 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsz8v\" (UniqueName: \"kubernetes.io/projected/1ded0656-834a-4d3d-8cf8-71ed8075e3aa-kube-api-access-lsz8v\") pod \"redhat-operators-vwwvw\" (UID: \"1ded0656-834a-4d3d-8cf8-71ed8075e3aa\") " pod="openshift-marketplace/redhat-operators-vwwvw" Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.012945 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:19 crc kubenswrapper[4711]: E1203 12:17:19.013285 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-03 12:17:19.513274529 +0000 UTC m=+158.182525784 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.113613 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:19 crc kubenswrapper[4711]: E1203 12:17:19.113804 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:19.613775108 +0000 UTC m=+158.283026383 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.114093 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:19 crc kubenswrapper[4711]: E1203 12:17:19.114433 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:19.614421825 +0000 UTC m=+158.283673080 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.153392 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rjr4q"] Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.203129 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vwwvw" Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.214738 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:19 crc kubenswrapper[4711]: E1203 12:17:19.215171 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:19.71515124 +0000 UTC m=+158.384402505 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.251170 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.251858 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.254267 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.254681 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.315972 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:19 crc kubenswrapper[4711]: E1203 12:17:19.316486 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-03 12:17:19.81647321 +0000 UTC m=+158.485724465 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:19 crc kubenswrapper[4711]: W1203 12:17:19.375767 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0908002c_11e7_4804_a059_dbd19dc6d739.slice/crio-29027ae30e0d1f37b8a81cbacef24ec9e9bdb46401e721a65c2241ed3b8b3a17 WatchSource:0}: Error finding container 29027ae30e0d1f37b8a81cbacef24ec9e9bdb46401e721a65c2241ed3b8b3a17: Status 404 returned error can't find the container with id 29027ae30e0d1f37b8a81cbacef24ec9e9bdb46401e721a65c2241ed3b8b3a17 Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.379464 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.421441 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.421809 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fc5b02f5-74ca-428f-98ef-87a69ceadcb1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: 
\"fc5b02f5-74ca-428f-98ef-87a69ceadcb1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.421834 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fc5b02f5-74ca-428f-98ef-87a69ceadcb1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"fc5b02f5-74ca-428f-98ef-87a69ceadcb1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 12:17:19 crc kubenswrapper[4711]: E1203 12:17:19.421991 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:19.921972882 +0000 UTC m=+158.591224137 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.441218 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lx4g4"] Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.451018 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjr4q" event={"ID":"0908002c-11e7-4804-a059-dbd19dc6d739","Type":"ContainerStarted","Data":"29027ae30e0d1f37b8a81cbacef24ec9e9bdb46401e721a65c2241ed3b8b3a17"} Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.487158 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="hostpath-provisioner/csi-hostpathplugin-bs88v" event={"ID":"30e431ee-b97c-4e18-91b0-ebc41960d7a5","Type":"ContainerStarted","Data":"9cbd78cc7fb7e289b58eccfe3528a47adc24012583e6244165694122d88a9d35"} Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.531161 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.531231 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fc5b02f5-74ca-428f-98ef-87a69ceadcb1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"fc5b02f5-74ca-428f-98ef-87a69ceadcb1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.531253 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fc5b02f5-74ca-428f-98ef-87a69ceadcb1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"fc5b02f5-74ca-428f-98ef-87a69ceadcb1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 12:17:19 crc kubenswrapper[4711]: E1203 12:17:19.532141 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:20.032129327 +0000 UTC m=+158.701380582 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.532335 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fc5b02f5-74ca-428f-98ef-87a69ceadcb1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"fc5b02f5-74ca-428f-98ef-87a69ceadcb1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.610819 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fc5b02f5-74ca-428f-98ef-87a69ceadcb1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"fc5b02f5-74ca-428f-98ef-87a69ceadcb1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.624345 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.632574 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:19 crc kubenswrapper[4711]: E1203 12:17:19.633051 4711 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:20.133032216 +0000 UTC m=+158.802283471 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.637091 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mj5ds"] Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.738669 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:19 crc kubenswrapper[4711]: E1203 12:17:19.739468 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:20.239447262 +0000 UTC m=+158.908698587 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.740097 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.758550 4711 patch_prober.go:28] interesting pod/router-default-5444994796-ng58r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:17:19 crc kubenswrapper[4711]: [-]has-synced failed: reason withheld Dec 03 12:17:19 crc kubenswrapper[4711]: [+]process-running ok Dec 03 12:17:19 crc kubenswrapper[4711]: healthz check failed Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.758604 4711 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ng58r" podUID="efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.840779 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:19 crc kubenswrapper[4711]: E1203 12:17:19.841152 4711 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:17:20.341137462 +0000 UTC m=+159.010388717 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.864758 4711 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.906800 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-fkpkw" Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.907188 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-fkpkw" Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.923803 4711 patch_prober.go:28] interesting pod/console-f9d7485db-fkpkw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.923856 4711 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-fkpkw" podUID="96380b31-f010-408a-b4b2-af721875ec8c" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: 
connect: connection refused" Dec 03 12:17:19 crc kubenswrapper[4711]: I1203 12:17:19.943280 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:19 crc kubenswrapper[4711]: E1203 12:17:19.943613 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:17:20.443601353 +0000 UTC m=+159.112852608 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sjc5j" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.007085 4711 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-03T12:17:19.86478052Z","Handler":null,"Name":""} Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.010326 4711 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.010360 4711 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: 
kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.044475 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.052687 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.056286 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vwwvw"] Dec 03 12:17:20 crc kubenswrapper[4711]: W1203 12:17:20.104726 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ded0656_834a_4d3d_8cf8_71ed8075e3aa.slice/crio-acffa7977af31e9a837422ff0ccd2777a2610e3c34d90c11511a890c9835692c WatchSource:0}: Error finding container acffa7977af31e9a837422ff0ccd2777a2610e3c34d90c11511a890c9835692c: Status 404 returned error can't find the container with id acffa7977af31e9a837422ff0ccd2777a2610e3c34d90c11511a890c9835692c Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.141218 4711 patch_prober.go:28] interesting pod/downloads-7954f5f757-x6jjm container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: 
connection refused" start-of-body= Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.141225 4711 patch_prober.go:28] interesting pod/downloads-7954f5f757-x6jjm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.141257 4711 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-x6jjm" podUID="8cc669b1-f54c-476c-ae6c-81f5bf982b7e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.141276 4711 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x6jjm" podUID="8cc669b1-f54c-476c-ae6c-81f5bf982b7e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.149665 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.153360 4711 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.153396 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.213827 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sjc5j\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.258423 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.258471 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.274083 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.276228 4711 patch_prober.go:28] interesting pod/apiserver-76f77b778f-gx6sm container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 03 12:17:20 crc kubenswrapper[4711]: [+]log ok Dec 03 12:17:20 crc kubenswrapper[4711]: [+]etcd ok Dec 03 12:17:20 crc kubenswrapper[4711]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 03 12:17:20 crc kubenswrapper[4711]: [+]poststarthook/generic-apiserver-start-informers ok Dec 03 12:17:20 crc kubenswrapper[4711]: [+]poststarthook/max-in-flight-filter ok Dec 03 12:17:20 crc kubenswrapper[4711]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 03 12:17:20 crc kubenswrapper[4711]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 03 12:17:20 crc kubenswrapper[4711]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 03 12:17:20 crc kubenswrapper[4711]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 03 12:17:20 crc kubenswrapper[4711]: [+]poststarthook/project.openshift.io-projectcache ok Dec 03 12:17:20 crc kubenswrapper[4711]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 03 12:17:20 crc kubenswrapper[4711]: [+]poststarthook/openshift.io-startinformers ok Dec 03 12:17:20 crc kubenswrapper[4711]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 03 12:17:20 crc kubenswrapper[4711]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 03 12:17:20 crc kubenswrapper[4711]: livez check failed Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.276291 4711 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" podUID="e4ac6509-fa27-4ca6-8cfa-dfc4f168ae69" 
containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.347689 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.573881 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bs88v" event={"ID":"30e431ee-b97c-4e18-91b0-ebc41960d7a5","Type":"ContainerStarted","Data":"b7fa4ca10e328b05b778423144c5ee1af190f2ef3cb78fcdc84a1706ed6cc6cc"} Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.573950 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bs88v" event={"ID":"30e431ee-b97c-4e18-91b0-ebc41960d7a5","Type":"ContainerStarted","Data":"23c0a7fb53ed14616d2eee531bccfb2db5e606b578790ab1d383ad065da28851"} Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.579630 4711 generic.go:334] "Generic (PLEG): container finished" podID="1ded0656-834a-4d3d-8cf8-71ed8075e3aa" containerID="2cc7e6508d69dc7d3ca5a7984c4ee779f6f7609455b419da712739d2403b8c46" exitCode=0 Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.579805 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwwvw" event={"ID":"1ded0656-834a-4d3d-8cf8-71ed8075e3aa","Type":"ContainerDied","Data":"2cc7e6508d69dc7d3ca5a7984c4ee779f6f7609455b419da712739d2403b8c46"} Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.579838 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwwvw" event={"ID":"1ded0656-834a-4d3d-8cf8-71ed8075e3aa","Type":"ContainerStarted","Data":"acffa7977af31e9a837422ff0ccd2777a2610e3c34d90c11511a890c9835692c"} Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.586043 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fc5b02f5-74ca-428f-98ef-87a69ceadcb1","Type":"ContainerStarted","Data":"66d7f1ee5620a7f9282730972b1afc80f88a27bb5aafbaa96467cda5ccd63b22"} Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.593729 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-bs88v" podStartSLOduration=12.593708265 podStartE2EDuration="12.593708265s" podCreationTimestamp="2025-12-03 12:17:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:20.592057262 +0000 UTC m=+159.261308537" watchObservedRunningTime="2025-12-03 12:17:20.593708265 +0000 UTC m=+159.262959530" Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.599491 4711 generic.go:334] "Generic (PLEG): container finished" podID="a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95" containerID="bfc6688eed32d5f0ac91702d200362720c60ef8c668a84d60496f0342e32e430" exitCode=0 Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.599570 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-shrp8" event={"ID":"a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95","Type":"ContainerDied","Data":"bfc6688eed32d5f0ac91702d200362720c60ef8c668a84d60496f0342e32e430"} Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.603810 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-c8g7k" Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.612718 4711 generic.go:334] "Generic (PLEG): container finished" podID="3d15af25-55a8-4d30-a83e-3bdf5dc963fc" containerID="0202e5e6d5b30133473dd8129a6116bf4cb5d7e9a43b6cf4e6f47591b11e51e6" exitCode=0 Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.612767 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-mj5ds" event={"ID":"3d15af25-55a8-4d30-a83e-3bdf5dc963fc","Type":"ContainerDied","Data":"0202e5e6d5b30133473dd8129a6116bf4cb5d7e9a43b6cf4e6f47591b11e51e6"} Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.612813 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mj5ds" event={"ID":"3d15af25-55a8-4d30-a83e-3bdf5dc963fc","Type":"ContainerStarted","Data":"3fb60af4000c47fdfd0d63ae6495519e3d6be6b54c380c075d789e2dcc034d98"} Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.614898 4711 generic.go:334] "Generic (PLEG): container finished" podID="559a1102-7df3-414e-9a4c-37cd0d35daf1" containerID="289e63ac9e534bb80ffc46bf185e8ce9338aec668de0a77107379eb5f0be5eb5" exitCode=0 Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.615110 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lx4g4" event={"ID":"559a1102-7df3-414e-9a4c-37cd0d35daf1","Type":"ContainerDied","Data":"289e63ac9e534bb80ffc46bf185e8ce9338aec668de0a77107379eb5f0be5eb5"} Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.615151 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lx4g4" event={"ID":"559a1102-7df3-414e-9a4c-37cd0d35daf1","Type":"ContainerStarted","Data":"ca60723c484f77fcfbbf06f17c6ac8898f63c0157a2379ccd0a016d1e2a89700"} Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.627491 4711 generic.go:334] "Generic (PLEG): container finished" podID="0908002c-11e7-4804-a059-dbd19dc6d739" containerID="db648fe560718a649e6003f03d7f8df1cdfb0eba2f0c2aabbd1e5db66ff0f7d7" exitCode=0 Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.627566 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjr4q" event={"ID":"0908002c-11e7-4804-a059-dbd19dc6d739","Type":"ContainerDied","Data":"db648fe560718a649e6003f03d7f8df1cdfb0eba2f0c2aabbd1e5db66ff0f7d7"} Dec 
03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.757482 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-ng58r" Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.761770 4711 patch_prober.go:28] interesting pod/router-default-5444994796-ng58r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:17:20 crc kubenswrapper[4711]: [-]has-synced failed: reason withheld Dec 03 12:17:20 crc kubenswrapper[4711]: [+]process-running ok Dec 03 12:17:20 crc kubenswrapper[4711]: healthz check failed Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.761814 4711 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ng58r" podUID="efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:17:20 crc kubenswrapper[4711]: I1203 12:17:20.816712 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sjc5j"] Dec 03 12:17:20 crc kubenswrapper[4711]: W1203 12:17:20.841689 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode58cccb4_9efa_494f_98c3_fbf61d444804.slice/crio-82449a86e4ad3618ca2f5d52c23fc3831346f9a66c3eb2fffa6b2635759c23de WatchSource:0}: Error finding container 82449a86e4ad3618ca2f5d52c23fc3831346f9a66c3eb2fffa6b2635759c23de: Status 404 returned error can't find the container with id 82449a86e4ad3618ca2f5d52c23fc3831346f9a66c3eb2fffa6b2635759c23de Dec 03 12:17:21 crc kubenswrapper[4711]: I1203 12:17:21.637416 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"fc5b02f5-74ca-428f-98ef-87a69ceadcb1","Type":"ContainerStarted","Data":"bb48d76bccc37ef42e0eea67a1c7c3b74ce8fd17ed28caf2564a8cfce66016d8"} Dec 03 12:17:21 crc kubenswrapper[4711]: I1203 12:17:21.644760 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" event={"ID":"e58cccb4-9efa-494f-98c3-fbf61d444804","Type":"ContainerStarted","Data":"e7931db9c66da5319a22fe22c95d85d77a16d46818f1ca9bf4cb413fdc4b0b0a"} Dec 03 12:17:21 crc kubenswrapper[4711]: I1203 12:17:21.644819 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" event={"ID":"e58cccb4-9efa-494f-98c3-fbf61d444804","Type":"ContainerStarted","Data":"82449a86e4ad3618ca2f5d52c23fc3831346f9a66c3eb2fffa6b2635759c23de"} Dec 03 12:17:21 crc kubenswrapper[4711]: I1203 12:17:21.658813 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.658789257 podStartE2EDuration="2.658789257s" podCreationTimestamp="2025-12-03 12:17:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:21.652925421 +0000 UTC m=+160.322176686" watchObservedRunningTime="2025-12-03 12:17:21.658789257 +0000 UTC m=+160.328040522" Dec 03 12:17:21 crc kubenswrapper[4711]: I1203 12:17:21.688210 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" podStartSLOduration=139.688183748 podStartE2EDuration="2m19.688183748s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:21.685200988 +0000 UTC m=+160.354452273" watchObservedRunningTime="2025-12-03 12:17:21.688183748 +0000 UTC m=+160.357435003" Dec 
03 12:17:21 crc kubenswrapper[4711]: I1203 12:17:21.759612 4711 patch_prober.go:28] interesting pod/router-default-5444994796-ng58r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:17:21 crc kubenswrapper[4711]: [-]has-synced failed: reason withheld Dec 03 12:17:21 crc kubenswrapper[4711]: [+]process-running ok Dec 03 12:17:21 crc kubenswrapper[4711]: healthz check failed Dec 03 12:17:21 crc kubenswrapper[4711]: I1203 12:17:21.759684 4711 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ng58r" podUID="efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:17:21 crc kubenswrapper[4711]: I1203 12:17:21.832632 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 03 12:17:22 crc kubenswrapper[4711]: I1203 12:17:22.182325 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-shrp8" Dec 03 12:17:22 crc kubenswrapper[4711]: I1203 12:17:22.306361 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95-secret-volume\") pod \"a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95\" (UID: \"a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95\") " Dec 03 12:17:22 crc kubenswrapper[4711]: I1203 12:17:22.306420 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjnsj\" (UniqueName: \"kubernetes.io/projected/a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95-kube-api-access-gjnsj\") pod \"a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95\" (UID: \"a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95\") " Dec 03 12:17:22 crc kubenswrapper[4711]: I1203 12:17:22.306518 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95-config-volume\") pod \"a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95\" (UID: \"a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95\") " Dec 03 12:17:22 crc kubenswrapper[4711]: I1203 12:17:22.307624 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95-config-volume" (OuterVolumeSpecName: "config-volume") pod "a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95" (UID: "a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:17:22 crc kubenswrapper[4711]: I1203 12:17:22.312963 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95" (UID: "a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:17:22 crc kubenswrapper[4711]: I1203 12:17:22.314971 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95-kube-api-access-gjnsj" (OuterVolumeSpecName: "kube-api-access-gjnsj") pod "a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95" (UID: "a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95"). InnerVolumeSpecName "kube-api-access-gjnsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:17:22 crc kubenswrapper[4711]: I1203 12:17:22.409670 4711 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 12:17:22 crc kubenswrapper[4711]: I1203 12:17:22.409715 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjnsj\" (UniqueName: \"kubernetes.io/projected/a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95-kube-api-access-gjnsj\") on node \"crc\" DevicePath \"\"" Dec 03 12:17:22 crc kubenswrapper[4711]: I1203 12:17:22.409727 4711 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 12:17:22 crc kubenswrapper[4711]: I1203 12:17:22.742986 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-shrp8" event={"ID":"a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95","Type":"ContainerDied","Data":"6b05814fa7f6ebe354236343a49e6ec344be2ae41fd36db28ee2c72d6a7ca011"} Dec 03 12:17:22 crc kubenswrapper[4711]: I1203 12:17:22.743646 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b05814fa7f6ebe354236343a49e6ec344be2ae41fd36db28ee2c72d6a7ca011" Dec 03 12:17:22 crc kubenswrapper[4711]: I1203 12:17:22.743672 4711 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:22 crc kubenswrapper[4711]: I1203 12:17:22.743131 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-shrp8" Dec 03 12:17:22 crc kubenswrapper[4711]: I1203 12:17:22.755960 4711 patch_prober.go:28] interesting pod/router-default-5444994796-ng58r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:17:22 crc kubenswrapper[4711]: [-]has-synced failed: reason withheld Dec 03 12:17:22 crc kubenswrapper[4711]: [+]process-running ok Dec 03 12:17:22 crc kubenswrapper[4711]: healthz check failed Dec 03 12:17:22 crc kubenswrapper[4711]: I1203 12:17:22.756143 4711 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ng58r" podUID="efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:17:23 crc kubenswrapper[4711]: I1203 12:17:23.756371 4711 patch_prober.go:28] interesting pod/router-default-5444994796-ng58r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:17:23 crc kubenswrapper[4711]: [-]has-synced failed: reason withheld Dec 03 12:17:23 crc kubenswrapper[4711]: [+]process-running ok Dec 03 12:17:23 crc kubenswrapper[4711]: healthz check failed Dec 03 12:17:23 crc kubenswrapper[4711]: I1203 12:17:23.756427 4711 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ng58r" podUID="efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:17:23 crc 
kubenswrapper[4711]: I1203 12:17:23.786662 4711 generic.go:334] "Generic (PLEG): container finished" podID="fc5b02f5-74ca-428f-98ef-87a69ceadcb1" containerID="bb48d76bccc37ef42e0eea67a1c7c3b74ce8fd17ed28caf2564a8cfce66016d8" exitCode=0 Dec 03 12:17:23 crc kubenswrapper[4711]: I1203 12:17:23.786705 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fc5b02f5-74ca-428f-98ef-87a69ceadcb1","Type":"ContainerDied","Data":"bb48d76bccc37ef42e0eea67a1c7c3b74ce8fd17ed28caf2564a8cfce66016d8"} Dec 03 12:17:24 crc kubenswrapper[4711]: I1203 12:17:24.652024 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cdb7f01e-b2fd-49da-b7de-621da238d797-metrics-certs\") pod \"network-metrics-daemon-wd9tz\" (UID: \"cdb7f01e-b2fd-49da-b7de-621da238d797\") " pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:17:24 crc kubenswrapper[4711]: I1203 12:17:24.657105 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cdb7f01e-b2fd-49da-b7de-621da238d797-metrics-certs\") pod \"network-metrics-daemon-wd9tz\" (UID: \"cdb7f01e-b2fd-49da-b7de-621da238d797\") " pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:17:24 crc kubenswrapper[4711]: I1203 12:17:24.773183 4711 patch_prober.go:28] interesting pod/router-default-5444994796-ng58r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:17:24 crc kubenswrapper[4711]: [-]has-synced failed: reason withheld Dec 03 12:17:24 crc kubenswrapper[4711]: [+]process-running ok Dec 03 12:17:24 crc kubenswrapper[4711]: healthz check failed Dec 03 12:17:24 crc kubenswrapper[4711]: I1203 12:17:24.773485 4711 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-ng58r" podUID="efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:17:24 crc kubenswrapper[4711]: I1203 12:17:24.800728 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd9tz" Dec 03 12:17:25 crc kubenswrapper[4711]: I1203 12:17:25.029962 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 12:17:25 crc kubenswrapper[4711]: E1203 12:17:25.030194 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95" containerName="collect-profiles" Dec 03 12:17:25 crc kubenswrapper[4711]: I1203 12:17:25.030206 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95" containerName="collect-profiles" Dec 03 12:17:25 crc kubenswrapper[4711]: I1203 12:17:25.030297 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95" containerName="collect-profiles" Dec 03 12:17:25 crc kubenswrapper[4711]: I1203 12:17:25.030643 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 12:17:25 crc kubenswrapper[4711]: I1203 12:17:25.052202 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 03 12:17:25 crc kubenswrapper[4711]: I1203 12:17:25.053000 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 03 12:17:25 crc kubenswrapper[4711]: I1203 12:17:25.059875 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 12:17:25 crc kubenswrapper[4711]: I1203 12:17:25.067383 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42ed72e2-32d3-4845-8659-32d51514dc3c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"42ed72e2-32d3-4845-8659-32d51514dc3c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 12:17:25 crc kubenswrapper[4711]: I1203 12:17:25.067502 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/42ed72e2-32d3-4845-8659-32d51514dc3c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"42ed72e2-32d3-4845-8659-32d51514dc3c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 12:17:25 crc kubenswrapper[4711]: I1203 12:17:25.168488 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/42ed72e2-32d3-4845-8659-32d51514dc3c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"42ed72e2-32d3-4845-8659-32d51514dc3c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 12:17:25 crc kubenswrapper[4711]: I1203 12:17:25.168608 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/42ed72e2-32d3-4845-8659-32d51514dc3c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"42ed72e2-32d3-4845-8659-32d51514dc3c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 12:17:25 crc kubenswrapper[4711]: I1203 12:17:25.168684 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42ed72e2-32d3-4845-8659-32d51514dc3c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"42ed72e2-32d3-4845-8659-32d51514dc3c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 12:17:25 crc kubenswrapper[4711]: I1203 12:17:25.190558 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/42ed72e2-32d3-4845-8659-32d51514dc3c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"42ed72e2-32d3-4845-8659-32d51514dc3c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 12:17:25 crc kubenswrapper[4711]: I1203 12:17:25.273599 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:25 crc kubenswrapper[4711]: I1203 12:17:25.283127 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-gx6sm" Dec 03 12:17:25 crc kubenswrapper[4711]: I1203 12:17:25.372107 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 12:17:25 crc kubenswrapper[4711]: I1203 12:17:25.470190 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wd9tz"] Dec 03 12:17:25 crc kubenswrapper[4711]: W1203 12:17:25.583088 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdb7f01e_b2fd_49da_b7de_621da238d797.slice/crio-ce19b4692cbfb9a6427e3558874160951ec2d8051a9e7eddaee4fef8e3e042cb WatchSource:0}: Error finding container ce19b4692cbfb9a6427e3558874160951ec2d8051a9e7eddaee4fef8e3e042cb: Status 404 returned error can't find the container with id ce19b4692cbfb9a6427e3558874160951ec2d8051a9e7eddaee4fef8e3e042cb Dec 03 12:17:25 crc kubenswrapper[4711]: I1203 12:17:25.749537 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 12:17:25 crc kubenswrapper[4711]: I1203 12:17:25.757723 4711 patch_prober.go:28] interesting pod/router-default-5444994796-ng58r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:17:25 crc kubenswrapper[4711]: [-]has-synced failed: reason withheld Dec 03 12:17:25 crc kubenswrapper[4711]: [+]process-running ok Dec 03 12:17:25 crc kubenswrapper[4711]: healthz check failed Dec 03 12:17:25 crc kubenswrapper[4711]: I1203 12:17:25.757772 4711 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ng58r" podUID="efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:17:25 crc kubenswrapper[4711]: I1203 12:17:25.781201 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/fc5b02f5-74ca-428f-98ef-87a69ceadcb1-kube-api-access\") pod \"fc5b02f5-74ca-428f-98ef-87a69ceadcb1\" (UID: \"fc5b02f5-74ca-428f-98ef-87a69ceadcb1\") " Dec 03 12:17:25 crc kubenswrapper[4711]: I1203 12:17:25.781244 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fc5b02f5-74ca-428f-98ef-87a69ceadcb1-kubelet-dir\") pod \"fc5b02f5-74ca-428f-98ef-87a69ceadcb1\" (UID: \"fc5b02f5-74ca-428f-98ef-87a69ceadcb1\") " Dec 03 12:17:25 crc kubenswrapper[4711]: I1203 12:17:25.781580 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc5b02f5-74ca-428f-98ef-87a69ceadcb1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "fc5b02f5-74ca-428f-98ef-87a69ceadcb1" (UID: "fc5b02f5-74ca-428f-98ef-87a69ceadcb1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:17:25 crc kubenswrapper[4711]: I1203 12:17:25.791160 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc5b02f5-74ca-428f-98ef-87a69ceadcb1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "fc5b02f5-74ca-428f-98ef-87a69ceadcb1" (UID: "fc5b02f5-74ca-428f-98ef-87a69ceadcb1"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:17:25 crc kubenswrapper[4711]: I1203 12:17:25.819266 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fc5b02f5-74ca-428f-98ef-87a69ceadcb1","Type":"ContainerDied","Data":"66d7f1ee5620a7f9282730972b1afc80f88a27bb5aafbaa96467cda5ccd63b22"} Dec 03 12:17:25 crc kubenswrapper[4711]: I1203 12:17:25.819329 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66d7f1ee5620a7f9282730972b1afc80f88a27bb5aafbaa96467cda5ccd63b22" Dec 03 12:17:25 crc kubenswrapper[4711]: I1203 12:17:25.819395 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 12:17:25 crc kubenswrapper[4711]: I1203 12:17:25.883124 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fc5b02f5-74ca-428f-98ef-87a69ceadcb1-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 12:17:25 crc kubenswrapper[4711]: I1203 12:17:25.883173 4711 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fc5b02f5-74ca-428f-98ef-87a69ceadcb1-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 12:17:25 crc kubenswrapper[4711]: I1203 12:17:25.893898 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wd9tz" event={"ID":"cdb7f01e-b2fd-49da-b7de-621da238d797","Type":"ContainerStarted","Data":"ce19b4692cbfb9a6427e3558874160951ec2d8051a9e7eddaee4fef8e3e042cb"} Dec 03 12:17:25 crc kubenswrapper[4711]: I1203 12:17:25.909321 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 12:17:25 crc kubenswrapper[4711]: W1203 12:17:25.930887 4711 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-pod42ed72e2_32d3_4845_8659_32d51514dc3c.slice/crio-9e8155632856525f4e844f78e2ac0472cddd9ca8b11c0a56b525fa0329f4a4d2 WatchSource:0}: Error finding container 9e8155632856525f4e844f78e2ac0472cddd9ca8b11c0a56b525fa0329f4a4d2: Status 404 returned error can't find the container with id 9e8155632856525f4e844f78e2ac0472cddd9ca8b11c0a56b525fa0329f4a4d2 Dec 03 12:17:26 crc kubenswrapper[4711]: I1203 12:17:26.073220 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-fh86s" Dec 03 12:17:26 crc kubenswrapper[4711]: I1203 12:17:26.756569 4711 patch_prober.go:28] interesting pod/router-default-5444994796-ng58r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:17:26 crc kubenswrapper[4711]: [-]has-synced failed: reason withheld Dec 03 12:17:26 crc kubenswrapper[4711]: [+]process-running ok Dec 03 12:17:26 crc kubenswrapper[4711]: healthz check failed Dec 03 12:17:26 crc kubenswrapper[4711]: I1203 12:17:26.757447 4711 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ng58r" podUID="efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:17:26 crc kubenswrapper[4711]: I1203 12:17:26.911388 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"42ed72e2-32d3-4845-8659-32d51514dc3c","Type":"ContainerStarted","Data":"56b4e636dc904717f7498ec32b040c240562c160474e61efb9477b4484b7fc76"} Dec 03 12:17:26 crc kubenswrapper[4711]: I1203 12:17:26.911453 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"42ed72e2-32d3-4845-8659-32d51514dc3c","Type":"ContainerStarted","Data":"9e8155632856525f4e844f78e2ac0472cddd9ca8b11c0a56b525fa0329f4a4d2"} Dec 03 12:17:27 crc kubenswrapper[4711]: I1203 12:17:27.760543 4711 patch_prober.go:28] interesting pod/router-default-5444994796-ng58r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:17:27 crc kubenswrapper[4711]: [-]has-synced failed: reason withheld Dec 03 12:17:27 crc kubenswrapper[4711]: [+]process-running ok Dec 03 12:17:27 crc kubenswrapper[4711]: healthz check failed Dec 03 12:17:27 crc kubenswrapper[4711]: I1203 12:17:27.760626 4711 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ng58r" podUID="efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:17:27 crc kubenswrapper[4711]: I1203 12:17:27.925577 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wd9tz" event={"ID":"cdb7f01e-b2fd-49da-b7de-621da238d797","Type":"ContainerStarted","Data":"a08d993ccbd6acd1bfaeb557392ff3a91c59b88989f9dd05ab202671705da027"} Dec 03 12:17:27 crc kubenswrapper[4711]: I1203 12:17:27.935280 4711 generic.go:334] "Generic (PLEG): container finished" podID="42ed72e2-32d3-4845-8659-32d51514dc3c" containerID="56b4e636dc904717f7498ec32b040c240562c160474e61efb9477b4484b7fc76" exitCode=0 Dec 03 12:17:27 crc kubenswrapper[4711]: I1203 12:17:27.935323 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"42ed72e2-32d3-4845-8659-32d51514dc3c","Type":"ContainerDied","Data":"56b4e636dc904717f7498ec32b040c240562c160474e61efb9477b4484b7fc76"} Dec 03 12:17:28 crc kubenswrapper[4711]: I1203 12:17:28.757483 4711 patch_prober.go:28] interesting 
pod/router-default-5444994796-ng58r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:17:28 crc kubenswrapper[4711]: [-]has-synced failed: reason withheld Dec 03 12:17:28 crc kubenswrapper[4711]: [+]process-running ok Dec 03 12:17:28 crc kubenswrapper[4711]: healthz check failed Dec 03 12:17:28 crc kubenswrapper[4711]: I1203 12:17:28.757760 4711 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ng58r" podUID="efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:17:28 crc kubenswrapper[4711]: I1203 12:17:28.957068 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wd9tz" event={"ID":"cdb7f01e-b2fd-49da-b7de-621da238d797","Type":"ContainerStarted","Data":"c6235ff2d48540f5f36d475317635c4fb62ffc11437e2f23699d19528cd176f6"} Dec 03 12:17:29 crc kubenswrapper[4711]: I1203 12:17:29.781198 4711 patch_prober.go:28] interesting pod/router-default-5444994796-ng58r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:17:29 crc kubenswrapper[4711]: [-]has-synced failed: reason withheld Dec 03 12:17:29 crc kubenswrapper[4711]: [+]process-running ok Dec 03 12:17:29 crc kubenswrapper[4711]: healthz check failed Dec 03 12:17:29 crc kubenswrapper[4711]: I1203 12:17:29.781412 4711 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ng58r" podUID="efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:17:29 crc kubenswrapper[4711]: I1203 12:17:29.915050 4711 patch_prober.go:28] interesting 
pod/console-f9d7485db-fkpkw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Dec 03 12:17:29 crc kubenswrapper[4711]: I1203 12:17:29.915104 4711 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-fkpkw" podUID="96380b31-f010-408a-b4b2-af721875ec8c" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Dec 03 12:17:30 crc kubenswrapper[4711]: I1203 12:17:30.044439 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-wd9tz" podStartSLOduration=148.044424644 podStartE2EDuration="2m28.044424644s" podCreationTimestamp="2025-12-03 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:30.042123522 +0000 UTC m=+168.711374777" watchObservedRunningTime="2025-12-03 12:17:30.044424644 +0000 UTC m=+168.713675889" Dec 03 12:17:30 crc kubenswrapper[4711]: I1203 12:17:30.141016 4711 patch_prober.go:28] interesting pod/downloads-7954f5f757-x6jjm container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 03 12:17:30 crc kubenswrapper[4711]: I1203 12:17:30.141074 4711 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-x6jjm" podUID="8cc669b1-f54c-476c-ae6c-81f5bf982b7e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 03 12:17:30 crc kubenswrapper[4711]: I1203 12:17:30.141428 4711 patch_prober.go:28] interesting pod/downloads-7954f5f757-x6jjm 
container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 03 12:17:30 crc kubenswrapper[4711]: I1203 12:17:30.141449 4711 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x6jjm" podUID="8cc669b1-f54c-476c-ae6c-81f5bf982b7e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 03 12:17:30 crc kubenswrapper[4711]: I1203 12:17:30.754848 4711 patch_prober.go:28] interesting pod/router-default-5444994796-ng58r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:17:30 crc kubenswrapper[4711]: [-]has-synced failed: reason withheld Dec 03 12:17:30 crc kubenswrapper[4711]: [+]process-running ok Dec 03 12:17:30 crc kubenswrapper[4711]: healthz check failed Dec 03 12:17:30 crc kubenswrapper[4711]: I1203 12:17:30.754942 4711 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ng58r" podUID="efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:17:31 crc kubenswrapper[4711]: I1203 12:17:31.756032 4711 patch_prober.go:28] interesting pod/router-default-5444994796-ng58r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:17:31 crc kubenswrapper[4711]: [-]has-synced failed: reason withheld Dec 03 12:17:31 crc kubenswrapper[4711]: [+]process-running ok Dec 03 12:17:31 crc kubenswrapper[4711]: healthz check failed Dec 03 12:17:31 crc kubenswrapper[4711]: I1203 12:17:31.756086 4711 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ng58r" podUID="efe4ef43-ab30-4ce5-a54b-3aceb4ab0f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:17:33 crc kubenswrapper[4711]: I1203 12:17:33.067270 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-ng58r" Dec 03 12:17:33 crc kubenswrapper[4711]: I1203 12:17:33.070999 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-ng58r" Dec 03 12:17:35 crc kubenswrapper[4711]: I1203 12:17:35.401370 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:17:35 crc kubenswrapper[4711]: I1203 12:17:35.401733 4711 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:17:37 crc kubenswrapper[4711]: I1203 12:17:37.421269 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 12:17:37 crc kubenswrapper[4711]: I1203 12:17:37.573946 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/42ed72e2-32d3-4845-8659-32d51514dc3c-kube-api-access\") pod \"42ed72e2-32d3-4845-8659-32d51514dc3c\" (UID: \"42ed72e2-32d3-4845-8659-32d51514dc3c\") " Dec 03 12:17:37 crc kubenswrapper[4711]: I1203 12:17:37.574052 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42ed72e2-32d3-4845-8659-32d51514dc3c-kubelet-dir\") pod \"42ed72e2-32d3-4845-8659-32d51514dc3c\" (UID: \"42ed72e2-32d3-4845-8659-32d51514dc3c\") " Dec 03 12:17:37 crc kubenswrapper[4711]: I1203 12:17:37.574209 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/42ed72e2-32d3-4845-8659-32d51514dc3c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "42ed72e2-32d3-4845-8659-32d51514dc3c" (UID: "42ed72e2-32d3-4845-8659-32d51514dc3c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:17:37 crc kubenswrapper[4711]: I1203 12:17:37.581128 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42ed72e2-32d3-4845-8659-32d51514dc3c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "42ed72e2-32d3-4845-8659-32d51514dc3c" (UID: "42ed72e2-32d3-4845-8659-32d51514dc3c"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:17:37 crc kubenswrapper[4711]: I1203 12:17:37.675246 4711 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42ed72e2-32d3-4845-8659-32d51514dc3c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 12:17:37 crc kubenswrapper[4711]: I1203 12:17:37.675281 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/42ed72e2-32d3-4845-8659-32d51514dc3c-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 12:17:38 crc kubenswrapper[4711]: I1203 12:17:38.152406 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"42ed72e2-32d3-4845-8659-32d51514dc3c","Type":"ContainerDied","Data":"9e8155632856525f4e844f78e2ac0472cddd9ca8b11c0a56b525fa0329f4a4d2"} Dec 03 12:17:38 crc kubenswrapper[4711]: I1203 12:17:38.152857 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e8155632856525f4e844f78e2ac0472cddd9ca8b11c0a56b525fa0329f4a4d2" Dec 03 12:17:38 crc kubenswrapper[4711]: I1203 12:17:38.152564 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 12:17:39 crc kubenswrapper[4711]: I1203 12:17:39.917473 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-fkpkw" Dec 03 12:17:39 crc kubenswrapper[4711]: I1203 12:17:39.922290 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-fkpkw" Dec 03 12:17:40 crc kubenswrapper[4711]: I1203 12:17:40.146584 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-x6jjm" Dec 03 12:17:40 crc kubenswrapper[4711]: I1203 12:17:40.281203 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:17:48 crc kubenswrapper[4711]: I1203 12:17:48.899823 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:17:50 crc kubenswrapper[4711]: I1203 12:17:50.596176 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66k2k" Dec 03 12:17:52 crc kubenswrapper[4711]: E1203 12:17:52.503848 4711 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 03 12:17:52 crc kubenswrapper[4711]: E1203 12:17:52.504062 4711 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lsz8v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-vwwvw_openshift-marketplace(1ded0656-834a-4d3d-8cf8-71ed8075e3aa): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 12:17:52 crc kubenswrapper[4711]: E1203 12:17:52.506048 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-vwwvw" podUID="1ded0656-834a-4d3d-8cf8-71ed8075e3aa" Dec 03 12:17:52 crc 
kubenswrapper[4711]: E1203 12:17:52.577264 4711 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 03 12:17:52 crc kubenswrapper[4711]: E1203 12:17:52.577445 4711 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bt9cs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-lx4g4_openshift-marketplace(559a1102-7df3-414e-9a4c-37cd0d35daf1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 12:17:52 crc kubenswrapper[4711]: E1203 12:17:52.579060 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-lx4g4" podUID="559a1102-7df3-414e-9a4c-37cd0d35daf1" Dec 03 12:17:56 crc kubenswrapper[4711]: I1203 12:17:56.606269 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 03 12:17:56 crc kubenswrapper[4711]: E1203 12:17:56.607274 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42ed72e2-32d3-4845-8659-32d51514dc3c" containerName="pruner" Dec 03 12:17:56 crc kubenswrapper[4711]: I1203 12:17:56.607307 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="42ed72e2-32d3-4845-8659-32d51514dc3c" containerName="pruner" Dec 03 12:17:56 crc kubenswrapper[4711]: E1203 12:17:56.607352 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc5b02f5-74ca-428f-98ef-87a69ceadcb1" containerName="pruner" Dec 03 12:17:56 crc kubenswrapper[4711]: I1203 12:17:56.607370 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc5b02f5-74ca-428f-98ef-87a69ceadcb1" containerName="pruner" Dec 03 12:17:56 crc kubenswrapper[4711]: I1203 12:17:56.607651 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="42ed72e2-32d3-4845-8659-32d51514dc3c" containerName="pruner" Dec 03 12:17:56 crc kubenswrapper[4711]: I1203 12:17:56.607678 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc5b02f5-74ca-428f-98ef-87a69ceadcb1" containerName="pruner" Dec 03 12:17:56 crc kubenswrapper[4711]: I1203 
12:17:56.608420 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 03 12:17:56 crc kubenswrapper[4711]: I1203 12:17:56.608576 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 12:17:56 crc kubenswrapper[4711]: I1203 12:17:56.626510 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 03 12:17:56 crc kubenswrapper[4711]: I1203 12:17:56.626763 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 03 12:17:56 crc kubenswrapper[4711]: I1203 12:17:56.763308 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5e1b2403-38fe-4376-971c-5b1334c7adef-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5e1b2403-38fe-4376-971c-5b1334c7adef\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 12:17:56 crc kubenswrapper[4711]: I1203 12:17:56.763393 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5e1b2403-38fe-4376-971c-5b1334c7adef-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5e1b2403-38fe-4376-971c-5b1334c7adef\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 12:17:56 crc kubenswrapper[4711]: I1203 12:17:56.865063 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5e1b2403-38fe-4376-971c-5b1334c7adef-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5e1b2403-38fe-4376-971c-5b1334c7adef\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 12:17:56 crc kubenswrapper[4711]: I1203 12:17:56.865168 4711 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5e1b2403-38fe-4376-971c-5b1334c7adef-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5e1b2403-38fe-4376-971c-5b1334c7adef\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 12:17:56 crc kubenswrapper[4711]: I1203 12:17:56.865394 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5e1b2403-38fe-4376-971c-5b1334c7adef-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5e1b2403-38fe-4376-971c-5b1334c7adef\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 12:17:56 crc kubenswrapper[4711]: I1203 12:17:56.892097 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5e1b2403-38fe-4376-971c-5b1334c7adef-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5e1b2403-38fe-4376-971c-5b1334c7adef\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 12:17:57 crc kubenswrapper[4711]: I1203 12:17:57.152259 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 12:17:58 crc kubenswrapper[4711]: E1203 12:17:58.284430 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-lx4g4" podUID="559a1102-7df3-414e-9a4c-37cd0d35daf1" Dec 03 12:17:58 crc kubenswrapper[4711]: E1203 12:17:58.285417 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-vwwvw" podUID="1ded0656-834a-4d3d-8cf8-71ed8075e3aa" Dec 03 12:17:58 crc kubenswrapper[4711]: E1203 12:17:58.910583 4711 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 03 12:17:58 crc kubenswrapper[4711]: E1203 12:17:58.910760 4711 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sxqdb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-7zrh5_openshift-marketplace(fff3e844-ad61-4914-aaff-fdd90e3a9a58): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 12:17:58 crc kubenswrapper[4711]: E1203 12:17:58.912376 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-7zrh5" podUID="fff3e844-ad61-4914-aaff-fdd90e3a9a58" Dec 03 12:18:00 crc 
kubenswrapper[4711]: E1203 12:18:00.735515 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-7zrh5" podUID="fff3e844-ad61-4914-aaff-fdd90e3a9a58" Dec 03 12:18:00 crc kubenswrapper[4711]: E1203 12:18:00.946852 4711 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 03 12:18:00 crc kubenswrapper[4711]: E1203 12:18:00.947030 4711 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hmwg5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-mj5ds_openshift-marketplace(3d15af25-55a8-4d30-a83e-3bdf5dc963fc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 12:18:00 crc kubenswrapper[4711]: E1203 12:18:00.948197 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-mj5ds" podUID="3d15af25-55a8-4d30-a83e-3bdf5dc963fc" Dec 03 12:18:01 crc 
kubenswrapper[4711]: E1203 12:18:01.001094 4711 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 03 12:18:01 crc kubenswrapper[4711]: E1203 12:18:01.001246 4711 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v4mgx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-rjr4q_openshift-marketplace(0908002c-11e7-4804-a059-dbd19dc6d739): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 12:18:01 crc kubenswrapper[4711]: E1203 12:18:01.002695 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-rjr4q" podUID="0908002c-11e7-4804-a059-dbd19dc6d739" Dec 03 12:18:02 crc kubenswrapper[4711]: E1203 12:18:02.206846 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-rjr4q" podUID="0908002c-11e7-4804-a059-dbd19dc6d739" Dec 03 12:18:02 crc kubenswrapper[4711]: E1203 12:18:02.206850 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-mj5ds" podUID="3d15af25-55a8-4d30-a83e-3bdf5dc963fc" Dec 03 12:18:02 crc kubenswrapper[4711]: E1203 12:18:02.305964 4711 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 03 12:18:02 crc kubenswrapper[4711]: E1203 12:18:02.306353 4711 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rb65q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-5wz5c_openshift-marketplace(61b13e0f-7d37-469f-bf80-3c4b455c34b0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 12:18:02 crc kubenswrapper[4711]: E1203 12:18:02.307830 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-5wz5c" podUID="61b13e0f-7d37-469f-bf80-3c4b455c34b0" Dec 03 12:18:02 crc kubenswrapper[4711]: E1203 12:18:02.337667 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-5wz5c" podUID="61b13e0f-7d37-469f-bf80-3c4b455c34b0" Dec 03 12:18:02 crc kubenswrapper[4711]: E1203 12:18:02.344195 4711 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 03 12:18:02 crc kubenswrapper[4711]: E1203 12:18:02.344502 4711 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gnkbr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-rvlj8_openshift-marketplace(80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 12:18:02 crc kubenswrapper[4711]: E1203 12:18:02.345689 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-rvlj8" podUID="80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02" Dec 03 12:18:02 crc 
kubenswrapper[4711]: E1203 12:18:02.347579 4711 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 03 12:18:02 crc kubenswrapper[4711]: E1203 12:18:02.350041 4711 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rsfdv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-bd87v_openshift-marketplace(25415f1a-641d-41b6-b91c-a48b76a9d465): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 12:18:02 crc kubenswrapper[4711]: E1203 12:18:02.351320 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-bd87v" podUID="25415f1a-641d-41b6-b91c-a48b76a9d465" Dec 03 12:18:02 crc kubenswrapper[4711]: I1203 12:18:02.366430 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 03 12:18:02 crc kubenswrapper[4711]: I1203 12:18:02.367837 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 12:18:02 crc kubenswrapper[4711]: I1203 12:18:02.378395 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 03 12:18:02 crc kubenswrapper[4711]: I1203 12:18:02.452673 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/40c22280-fb06-40d1-9599-ce73f2d137d9-kubelet-dir\") pod \"installer-9-crc\" (UID: \"40c22280-fb06-40d1-9599-ce73f2d137d9\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 12:18:02 crc kubenswrapper[4711]: I1203 12:18:02.452712 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/40c22280-fb06-40d1-9599-ce73f2d137d9-var-lock\") pod \"installer-9-crc\" (UID: \"40c22280-fb06-40d1-9599-ce73f2d137d9\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 12:18:02 crc kubenswrapper[4711]: I1203 12:18:02.452768 
4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/40c22280-fb06-40d1-9599-ce73f2d137d9-kube-api-access\") pod \"installer-9-crc\" (UID: \"40c22280-fb06-40d1-9599-ce73f2d137d9\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 12:18:02 crc kubenswrapper[4711]: I1203 12:18:02.553638 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/40c22280-fb06-40d1-9599-ce73f2d137d9-kubelet-dir\") pod \"installer-9-crc\" (UID: \"40c22280-fb06-40d1-9599-ce73f2d137d9\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 12:18:02 crc kubenswrapper[4711]: I1203 12:18:02.553769 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/40c22280-fb06-40d1-9599-ce73f2d137d9-var-lock\") pod \"installer-9-crc\" (UID: \"40c22280-fb06-40d1-9599-ce73f2d137d9\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 12:18:02 crc kubenswrapper[4711]: I1203 12:18:02.553722 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/40c22280-fb06-40d1-9599-ce73f2d137d9-kubelet-dir\") pod \"installer-9-crc\" (UID: \"40c22280-fb06-40d1-9599-ce73f2d137d9\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 12:18:02 crc kubenswrapper[4711]: I1203 12:18:02.553864 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/40c22280-fb06-40d1-9599-ce73f2d137d9-var-lock\") pod \"installer-9-crc\" (UID: \"40c22280-fb06-40d1-9599-ce73f2d137d9\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 12:18:02 crc kubenswrapper[4711]: I1203 12:18:02.553884 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/40c22280-fb06-40d1-9599-ce73f2d137d9-kube-api-access\") pod \"installer-9-crc\" (UID: \"40c22280-fb06-40d1-9599-ce73f2d137d9\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 12:18:02 crc kubenswrapper[4711]: I1203 12:18:02.574018 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/40c22280-fb06-40d1-9599-ce73f2d137d9-kube-api-access\") pod \"installer-9-crc\" (UID: \"40c22280-fb06-40d1-9599-ce73f2d137d9\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 12:18:02 crc kubenswrapper[4711]: I1203 12:18:02.657325 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 03 12:18:02 crc kubenswrapper[4711]: W1203 12:18:02.665076 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5e1b2403_38fe_4376_971c_5b1334c7adef.slice/crio-1b771d38a8785b16400f8fe0f03c6977a406ebe713bf752e0d0f519976ce40fa WatchSource:0}: Error finding container 1b771d38a8785b16400f8fe0f03c6977a406ebe713bf752e0d0f519976ce40fa: Status 404 returned error can't find the container with id 1b771d38a8785b16400f8fe0f03c6977a406ebe713bf752e0d0f519976ce40fa Dec 03 12:18:02 crc kubenswrapper[4711]: I1203 12:18:02.708174 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 12:18:03 crc kubenswrapper[4711]: I1203 12:18:03.143386 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 03 12:18:03 crc kubenswrapper[4711]: I1203 12:18:03.340081 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"40c22280-fb06-40d1-9599-ce73f2d137d9","Type":"ContainerStarted","Data":"09f930c899fd270d3214664df82125384a9557bba16988bb531d9c902286cf18"} Dec 03 12:18:03 crc kubenswrapper[4711]: I1203 12:18:03.342103 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5e1b2403-38fe-4376-971c-5b1334c7adef","Type":"ContainerStarted","Data":"c12fd4519ad1c8e061a8be96b2a103ff047efcd36a04955602c4d74828884914"} Dec 03 12:18:03 crc kubenswrapper[4711]: I1203 12:18:03.342133 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5e1b2403-38fe-4376-971c-5b1334c7adef","Type":"ContainerStarted","Data":"1b771d38a8785b16400f8fe0f03c6977a406ebe713bf752e0d0f519976ce40fa"} Dec 03 12:18:03 crc kubenswrapper[4711]: E1203 12:18:03.343348 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-rvlj8" podUID="80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02" Dec 03 12:18:03 crc kubenswrapper[4711]: E1203 12:18:03.343878 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bd87v" podUID="25415f1a-641d-41b6-b91c-a48b76a9d465" Dec 03 
12:18:03 crc kubenswrapper[4711]: I1203 12:18:03.409810 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=7.409794368 podStartE2EDuration="7.409794368s" podCreationTimestamp="2025-12-03 12:17:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:18:03.408652498 +0000 UTC m=+202.077903753" watchObservedRunningTime="2025-12-03 12:18:03.409794368 +0000 UTC m=+202.079045623" Dec 03 12:18:04 crc kubenswrapper[4711]: I1203 12:18:04.351144 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"40c22280-fb06-40d1-9599-ce73f2d137d9","Type":"ContainerStarted","Data":"f3e3f882682afa74d456a0b4465ef170691631f67c33d8bb380365aaba1cf4be"} Dec 03 12:18:04 crc kubenswrapper[4711]: I1203 12:18:04.353059 4711 generic.go:334] "Generic (PLEG): container finished" podID="5e1b2403-38fe-4376-971c-5b1334c7adef" containerID="c12fd4519ad1c8e061a8be96b2a103ff047efcd36a04955602c4d74828884914" exitCode=0 Dec 03 12:18:04 crc kubenswrapper[4711]: I1203 12:18:04.353145 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5e1b2403-38fe-4376-971c-5b1334c7adef","Type":"ContainerDied","Data":"c12fd4519ad1c8e061a8be96b2a103ff047efcd36a04955602c4d74828884914"} Dec 03 12:18:04 crc kubenswrapper[4711]: I1203 12:18:04.366463 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.366442611 podStartE2EDuration="2.366442611s" podCreationTimestamp="2025-12-03 12:18:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:18:04.36266798 +0000 UTC m=+203.031919245" watchObservedRunningTime="2025-12-03 
12:18:04.366442611 +0000 UTC m=+203.035693856" Dec 03 12:18:05 crc kubenswrapper[4711]: I1203 12:18:05.403264 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:18:05 crc kubenswrapper[4711]: I1203 12:18:05.404104 4711 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:18:05 crc kubenswrapper[4711]: I1203 12:18:05.404174 4711 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" Dec 03 12:18:05 crc kubenswrapper[4711]: I1203 12:18:05.406688 4711 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d82c5517a78be53885c5d2e8a414e572be5defba4fb0e90131d85a364814f137"} pod="openshift-machine-config-operator/machine-config-daemon-52jgg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 12:18:05 crc kubenswrapper[4711]: I1203 12:18:05.406880 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" containerID="cri-o://d82c5517a78be53885c5d2e8a414e572be5defba4fb0e90131d85a364814f137" gracePeriod=600 Dec 03 12:18:11 crc kubenswrapper[4711]: I1203 12:18:11.407841 4711 generic.go:334] "Generic (PLEG): container finished" podID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" 
containerID="d82c5517a78be53885c5d2e8a414e572be5defba4fb0e90131d85a364814f137" exitCode=0 Dec 03 12:18:11 crc kubenswrapper[4711]: I1203 12:18:11.407990 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" event={"ID":"776e7d35-d59b-4d4a-97cd-aec4f2441c1e","Type":"ContainerDied","Data":"d82c5517a78be53885c5d2e8a414e572be5defba4fb0e90131d85a364814f137"} Dec 03 12:18:13 crc kubenswrapper[4711]: I1203 12:18:13.703769 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 12:18:13 crc kubenswrapper[4711]: I1203 12:18:13.811670 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5e1b2403-38fe-4376-971c-5b1334c7adef-kubelet-dir\") pod \"5e1b2403-38fe-4376-971c-5b1334c7adef\" (UID: \"5e1b2403-38fe-4376-971c-5b1334c7adef\") " Dec 03 12:18:13 crc kubenswrapper[4711]: I1203 12:18:13.811753 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5e1b2403-38fe-4376-971c-5b1334c7adef-kube-api-access\") pod \"5e1b2403-38fe-4376-971c-5b1334c7adef\" (UID: \"5e1b2403-38fe-4376-971c-5b1334c7adef\") " Dec 03 12:18:13 crc kubenswrapper[4711]: I1203 12:18:13.812781 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e1b2403-38fe-4376-971c-5b1334c7adef-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5e1b2403-38fe-4376-971c-5b1334c7adef" (UID: "5e1b2403-38fe-4376-971c-5b1334c7adef"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:18:13 crc kubenswrapper[4711]: I1203 12:18:13.818237 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e1b2403-38fe-4376-971c-5b1334c7adef-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5e1b2403-38fe-4376-971c-5b1334c7adef" (UID: "5e1b2403-38fe-4376-971c-5b1334c7adef"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:18:13 crc kubenswrapper[4711]: I1203 12:18:13.912685 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5e1b2403-38fe-4376-971c-5b1334c7adef-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 12:18:13 crc kubenswrapper[4711]: I1203 12:18:13.912729 4711 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5e1b2403-38fe-4376-971c-5b1334c7adef-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 12:18:14 crc kubenswrapper[4711]: I1203 12:18:14.453773 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" event={"ID":"776e7d35-d59b-4d4a-97cd-aec4f2441c1e","Type":"ContainerStarted","Data":"439acc8b24a33cb172441cb0bf7aad33e2c32df929a12c484a27ad537b2e6d97"} Dec 03 12:18:14 crc kubenswrapper[4711]: I1203 12:18:14.455657 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5e1b2403-38fe-4376-971c-5b1334c7adef","Type":"ContainerDied","Data":"1b771d38a8785b16400f8fe0f03c6977a406ebe713bf752e0d0f519976ce40fa"} Dec 03 12:18:14 crc kubenswrapper[4711]: I1203 12:18:14.455693 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b771d38a8785b16400f8fe0f03c6977a406ebe713bf752e0d0f519976ce40fa" Dec 03 12:18:14 crc kubenswrapper[4711]: I1203 12:18:14.455761 4711 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 12:18:15 crc kubenswrapper[4711]: I1203 12:18:15.463619 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwwvw" event={"ID":"1ded0656-834a-4d3d-8cf8-71ed8075e3aa","Type":"ContainerStarted","Data":"6faa9e49e23b1c5f32fcf8276fca10f001407b3850dd548e835d40ffcf846b3b"} Dec 03 12:18:15 crc kubenswrapper[4711]: I1203 12:18:15.466640 4711 generic.go:334] "Generic (PLEG): container finished" podID="80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02" containerID="4302a43a8af92f42293423aff1e26f41a00379aac4d169393b0f8e9fd3fabbd2" exitCode=0 Dec 03 12:18:15 crc kubenswrapper[4711]: I1203 12:18:15.466700 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvlj8" event={"ID":"80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02","Type":"ContainerDied","Data":"4302a43a8af92f42293423aff1e26f41a00379aac4d169393b0f8e9fd3fabbd2"} Dec 03 12:18:15 crc kubenswrapper[4711]: I1203 12:18:15.470682 4711 generic.go:334] "Generic (PLEG): container finished" podID="559a1102-7df3-414e-9a4c-37cd0d35daf1" containerID="480b9c510ccff28b8b35bcdbbf01dc802aa4e76f7e727ee89921e21110b3ee38" exitCode=0 Dec 03 12:18:15 crc kubenswrapper[4711]: I1203 12:18:15.470738 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lx4g4" event={"ID":"559a1102-7df3-414e-9a4c-37cd0d35daf1","Type":"ContainerDied","Data":"480b9c510ccff28b8b35bcdbbf01dc802aa4e76f7e727ee89921e21110b3ee38"} Dec 03 12:18:15 crc kubenswrapper[4711]: I1203 12:18:15.473178 4711 generic.go:334] "Generic (PLEG): container finished" podID="0908002c-11e7-4804-a059-dbd19dc6d739" containerID="791af8e590189b2961887ad5f53184b099d4b22bc0b70873f9d713e1026766b3" exitCode=0 Dec 03 12:18:15 crc kubenswrapper[4711]: I1203 12:18:15.473262 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-rjr4q" event={"ID":"0908002c-11e7-4804-a059-dbd19dc6d739","Type":"ContainerDied","Data":"791af8e590189b2961887ad5f53184b099d4b22bc0b70873f9d713e1026766b3"} Dec 03 12:18:16 crc kubenswrapper[4711]: I1203 12:18:16.482697 4711 generic.go:334] "Generic (PLEG): container finished" podID="1ded0656-834a-4d3d-8cf8-71ed8075e3aa" containerID="6faa9e49e23b1c5f32fcf8276fca10f001407b3850dd548e835d40ffcf846b3b" exitCode=0 Dec 03 12:18:16 crc kubenswrapper[4711]: I1203 12:18:16.483332 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwwvw" event={"ID":"1ded0656-834a-4d3d-8cf8-71ed8075e3aa","Type":"ContainerDied","Data":"6faa9e49e23b1c5f32fcf8276fca10f001407b3850dd548e835d40ffcf846b3b"} Dec 03 12:18:16 crc kubenswrapper[4711]: I1203 12:18:16.486632 4711 generic.go:334] "Generic (PLEG): container finished" podID="fff3e844-ad61-4914-aaff-fdd90e3a9a58" containerID="f6f3813e3a544d8b271331d7ee9a8a57a48139ab389dd92614c08f3de8bc958d" exitCode=0 Dec 03 12:18:16 crc kubenswrapper[4711]: I1203 12:18:16.486696 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zrh5" event={"ID":"fff3e844-ad61-4914-aaff-fdd90e3a9a58","Type":"ContainerDied","Data":"f6f3813e3a544d8b271331d7ee9a8a57a48139ab389dd92614c08f3de8bc958d"} Dec 03 12:18:17 crc kubenswrapper[4711]: I1203 12:18:17.541995 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvlj8" event={"ID":"80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02","Type":"ContainerStarted","Data":"9b226c9c8a105989c5aba1ea8fe9976f8228dae66080c85c6440ea1bfc08bd0a"} Dec 03 12:18:17 crc kubenswrapper[4711]: I1203 12:18:17.548824 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lx4g4" event={"ID":"559a1102-7df3-414e-9a4c-37cd0d35daf1","Type":"ContainerStarted","Data":"956617d0bd7873e1a4f11bd8c6dbefbcf1969a2e17ea1a362dcb09e054e68f27"} 
Dec 03 12:18:17 crc kubenswrapper[4711]: I1203 12:18:17.578343 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjr4q" event={"ID":"0908002c-11e7-4804-a059-dbd19dc6d739","Type":"ContainerStarted","Data":"54edfc62586ace4fc50876a6e4e39930a8a706e3d97fa66b3e5c35714404572e"} Dec 03 12:18:17 crc kubenswrapper[4711]: I1203 12:18:17.578383 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rvlj8" podStartSLOduration=3.6576400959999997 podStartE2EDuration="1m2.578371041s" podCreationTimestamp="2025-12-03 12:17:15 +0000 UTC" firstStartedPulling="2025-12-03 12:17:18.380079966 +0000 UTC m=+157.049331221" lastFinishedPulling="2025-12-03 12:18:17.300810911 +0000 UTC m=+215.970062166" observedRunningTime="2025-12-03 12:18:17.578117865 +0000 UTC m=+216.247369140" watchObservedRunningTime="2025-12-03 12:18:17.578371041 +0000 UTC m=+216.247622296" Dec 03 12:18:17 crc kubenswrapper[4711]: I1203 12:18:17.635000 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lx4g4" podStartSLOduration=3.247134376 podStartE2EDuration="59.634983424s" podCreationTimestamp="2025-12-03 12:17:18 +0000 UTC" firstStartedPulling="2025-12-03 12:17:20.616632094 +0000 UTC m=+159.285883349" lastFinishedPulling="2025-12-03 12:18:17.004481142 +0000 UTC m=+215.673732397" observedRunningTime="2025-12-03 12:18:17.608898102 +0000 UTC m=+216.278149377" watchObservedRunningTime="2025-12-03 12:18:17.634983424 +0000 UTC m=+216.304234679" Dec 03 12:18:17 crc kubenswrapper[4711]: I1203 12:18:17.829305 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rjr4q" Dec 03 12:18:17 crc kubenswrapper[4711]: I1203 12:18:17.829351 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rjr4q" Dec 03 12:18:18 crc 
kubenswrapper[4711]: I1203 12:18:18.589762 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zrh5" event={"ID":"fff3e844-ad61-4914-aaff-fdd90e3a9a58","Type":"ContainerStarted","Data":"7701253679bdf34d74e546b732074f80007db322b987baca1c7dcf023646edb3"}
Dec 03 12:18:18 crc kubenswrapper[4711]: I1203 12:18:18.599160 4711 generic.go:334] "Generic (PLEG): container finished" podID="3d15af25-55a8-4d30-a83e-3bdf5dc963fc" containerID="3abbc733c68c32db5f21ad99ed0cd52f52f76505b7be5ef4ad7fcb3360c360f0" exitCode=0
Dec 03 12:18:18 crc kubenswrapper[4711]: I1203 12:18:18.599232 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mj5ds" event={"ID":"3d15af25-55a8-4d30-a83e-3bdf5dc963fc","Type":"ContainerDied","Data":"3abbc733c68c32db5f21ad99ed0cd52f52f76505b7be5ef4ad7fcb3360c360f0"}
Dec 03 12:18:18 crc kubenswrapper[4711]: I1203 12:18:18.602460 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwwvw" event={"ID":"1ded0656-834a-4d3d-8cf8-71ed8075e3aa","Type":"ContainerStarted","Data":"57f461ce796465836aa539c0afff8095c77396ca0bcf2895af2578aaa724074e"}
Dec 03 12:18:18 crc kubenswrapper[4711]: I1203 12:18:18.607350 4711 generic.go:334] "Generic (PLEG): container finished" podID="25415f1a-641d-41b6-b91c-a48b76a9d465" containerID="0a1252ad33d3785c712cd560aff6586d7f3cdf28cce3d46d061e8cf02cbe7427" exitCode=0
Dec 03 12:18:18 crc kubenswrapper[4711]: I1203 12:18:18.607432 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bd87v" event={"ID":"25415f1a-641d-41b6-b91c-a48b76a9d465","Type":"ContainerDied","Data":"0a1252ad33d3785c712cd560aff6586d7f3cdf28cce3d46d061e8cf02cbe7427"}
Dec 03 12:18:18 crc kubenswrapper[4711]: I1203 12:18:18.609557 4711 generic.go:334] "Generic (PLEG): container finished" podID="61b13e0f-7d37-469f-bf80-3c4b455c34b0" containerID="43ee3b491b0dc49f89fa80fbca61ac4f3b4ad3a645a896a7512d4add9c8ceccf" exitCode=0
Dec 03 12:18:18 crc kubenswrapper[4711]: I1203 12:18:18.609981 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wz5c" event={"ID":"61b13e0f-7d37-469f-bf80-3c4b455c34b0","Type":"ContainerDied","Data":"43ee3b491b0dc49f89fa80fbca61ac4f3b4ad3a645a896a7512d4add9c8ceccf"}
Dec 03 12:18:18 crc kubenswrapper[4711]: I1203 12:18:18.619870 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rjr4q" podStartSLOduration=5.294958819 podStartE2EDuration="1m1.619855686s" podCreationTimestamp="2025-12-03 12:17:17 +0000 UTC" firstStartedPulling="2025-12-03 12:17:20.640350684 +0000 UTC m=+159.309601939" lastFinishedPulling="2025-12-03 12:18:16.965247551 +0000 UTC m=+215.634498806" observedRunningTime="2025-12-03 12:18:17.632232431 +0000 UTC m=+216.301483696" watchObservedRunningTime="2025-12-03 12:18:18.619855686 +0000 UTC m=+217.289106941"
Dec 03 12:18:18 crc kubenswrapper[4711]: I1203 12:18:18.642180 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7zrh5" podStartSLOduration=4.6465319449999996 podStartE2EDuration="1m3.642162209s" podCreationTimestamp="2025-12-03 12:17:15 +0000 UTC" firstStartedPulling="2025-12-03 12:17:18.351086416 +0000 UTC m=+157.020337671" lastFinishedPulling="2025-12-03 12:18:17.34671668 +0000 UTC m=+216.015967935" observedRunningTime="2025-12-03 12:18:18.621450888 +0000 UTC m=+217.290702173" watchObservedRunningTime="2025-12-03 12:18:18.642162209 +0000 UTC m=+217.311413464"
Dec 03 12:18:18 crc kubenswrapper[4711]: I1203 12:18:18.663171 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vwwvw" podStartSLOduration=3.79220459 podStartE2EDuration="1m0.663147346s" podCreationTimestamp="2025-12-03 12:17:18 +0000 UTC" firstStartedPulling="2025-12-03 12:17:20.589899804 +0000 UTC m=+159.259151059" lastFinishedPulling="2025-12-03 12:18:17.46084257 +0000 UTC m=+216.130093815" observedRunningTime="2025-12-03 12:18:18.660055264 +0000 UTC m=+217.329306529" watchObservedRunningTime="2025-12-03 12:18:18.663147346 +0000 UTC m=+217.332398611"
Dec 03 12:18:18 crc kubenswrapper[4711]: I1203 12:18:18.761749 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lx4g4"
Dec 03 12:18:18 crc kubenswrapper[4711]: I1203 12:18:18.761804 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lx4g4"
Dec 03 12:18:18 crc kubenswrapper[4711]: I1203 12:18:18.951968 4711 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-rjr4q" podUID="0908002c-11e7-4804-a059-dbd19dc6d739" containerName="registry-server" probeResult="failure" output=<
Dec 03 12:18:18 crc kubenswrapper[4711]: timeout: failed to connect service ":50051" within 1s
Dec 03 12:18:18 crc kubenswrapper[4711]: >
Dec 03 12:18:19 crc kubenswrapper[4711]: I1203 12:18:19.204055 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vwwvw"
Dec 03 12:18:19 crc kubenswrapper[4711]: I1203 12:18:19.204117 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vwwvw"
Dec 03 12:18:19 crc kubenswrapper[4711]: I1203 12:18:19.617580 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mj5ds" event={"ID":"3d15af25-55a8-4d30-a83e-3bdf5dc963fc","Type":"ContainerStarted","Data":"82a2b77c9b50ff2b51c334b9bc5cd4a5c11c18fe1af29cf6d89de0688bd7cd44"}
Dec 03 12:18:19 crc kubenswrapper[4711]: I1203 12:18:19.622245 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bd87v" event={"ID":"25415f1a-641d-41b6-b91c-a48b76a9d465","Type":"ContainerStarted","Data":"f128a2a5c30e840833f849ed31d2a0bbdab35dfadfc6c83dca08952cd687480f"}
Dec 03 12:18:19 crc kubenswrapper[4711]: I1203 12:18:19.625030 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wz5c" event={"ID":"61b13e0f-7d37-469f-bf80-3c4b455c34b0","Type":"ContainerStarted","Data":"06ea9839e0370a538de1486cc8d4332a6fdedcd721c817a475f171680ac82f54"}
Dec 03 12:18:19 crc kubenswrapper[4711]: I1203 12:18:19.642014 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mj5ds" podStartSLOduration=4.245096362 podStartE2EDuration="1m2.641993968s" podCreationTimestamp="2025-12-03 12:17:17 +0000 UTC" firstStartedPulling="2025-12-03 12:17:20.615480403 +0000 UTC m=+159.284731658" lastFinishedPulling="2025-12-03 12:18:19.012378009 +0000 UTC m=+217.681629264" observedRunningTime="2025-12-03 12:18:19.637436997 +0000 UTC m=+218.306688262" watchObservedRunningTime="2025-12-03 12:18:19.641993968 +0000 UTC m=+218.311245223"
Dec 03 12:18:19 crc kubenswrapper[4711]: I1203 12:18:19.682217 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5wz5c" podStartSLOduration=4.043448571 podStartE2EDuration="1m4.682193895s" podCreationTimestamp="2025-12-03 12:17:15 +0000 UTC" firstStartedPulling="2025-12-03 12:17:18.383134697 +0000 UTC m=+157.052385952" lastFinishedPulling="2025-12-03 12:18:19.021880021 +0000 UTC m=+217.691131276" observedRunningTime="2025-12-03 12:18:19.680453618 +0000 UTC m=+218.349704893" watchObservedRunningTime="2025-12-03 12:18:19.682193895 +0000 UTC m=+218.351445150"
Dec 03 12:18:19 crc kubenswrapper[4711]: I1203 12:18:19.703426 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bd87v" podStartSLOduration=3.846306596 podStartE2EDuration="1m4.703408488s" podCreationTimestamp="2025-12-03 12:17:15 +0000 UTC" firstStartedPulling="2025-12-03 12:17:18.361941725 +0000 UTC m=+157.031192980" lastFinishedPulling="2025-12-03 12:18:19.219043617 +0000 UTC m=+217.888294872" observedRunningTime="2025-12-03 12:18:19.701506228 +0000 UTC m=+218.370757503" watchObservedRunningTime="2025-12-03 12:18:19.703408488 +0000 UTC m=+218.372659743"
Dec 03 12:18:19 crc kubenswrapper[4711]: I1203 12:18:19.808432 4711 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lx4g4" podUID="559a1102-7df3-414e-9a4c-37cd0d35daf1" containerName="registry-server" probeResult="failure" output=<
Dec 03 12:18:19 crc kubenswrapper[4711]: timeout: failed to connect service ":50051" within 1s
Dec 03 12:18:19 crc kubenswrapper[4711]: >
Dec 03 12:18:20 crc kubenswrapper[4711]: I1203 12:18:20.253824 4711 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vwwvw" podUID="1ded0656-834a-4d3d-8cf8-71ed8075e3aa" containerName="registry-server" probeResult="failure" output=<
Dec 03 12:18:20 crc kubenswrapper[4711]: timeout: failed to connect service ":50051" within 1s
Dec 03 12:18:20 crc kubenswrapper[4711]: >
Dec 03 12:18:25 crc kubenswrapper[4711]: I1203 12:18:25.610407 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5wz5c"
Dec 03 12:18:25 crc kubenswrapper[4711]: I1203 12:18:25.610693 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5wz5c"
Dec 03 12:18:25 crc kubenswrapper[4711]: I1203 12:18:25.773873 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7zrh5"
Dec 03 12:18:25 crc kubenswrapper[4711]: I1203 12:18:25.773982 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7zrh5"
Dec 03 12:18:26 crc kubenswrapper[4711]: I1203 12:18:26.026267 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rvlj8"
Dec 03 12:18:26 crc kubenswrapper[4711]: I1203 12:18:26.026429 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rvlj8"
Dec 03 12:18:26 crc kubenswrapper[4711]: I1203 12:18:26.137481 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bd87v"
Dec 03 12:18:26 crc kubenswrapper[4711]: I1203 12:18:26.137556 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bd87v"
Dec 03 12:18:26 crc kubenswrapper[4711]: I1203 12:18:26.435791 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rvlj8"
Dec 03 12:18:26 crc kubenswrapper[4711]: I1203 12:18:26.437598 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7zrh5"
Dec 03 12:18:26 crc kubenswrapper[4711]: I1203 12:18:26.438905 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bd87v"
Dec 03 12:18:26 crc kubenswrapper[4711]: I1203 12:18:26.439342 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5wz5c"
Dec 03 12:18:26 crc kubenswrapper[4711]: I1203 12:18:26.478745 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5wz5c"
Dec 03 12:18:26 crc kubenswrapper[4711]: I1203 12:18:26.752089 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rvlj8"
Dec 03 12:18:26 crc kubenswrapper[4711]: I1203 12:18:26.850701 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bd87v"
Dec 03 12:18:26 crc kubenswrapper[4711]: I1203 12:18:26.861795 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7zrh5"
Dec 03 12:18:27 crc kubenswrapper[4711]: I1203 12:18:27.878493 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rjr4q"
Dec 03 12:18:27 crc kubenswrapper[4711]: I1203 12:18:27.927864 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rjr4q"
Dec 03 12:18:28 crc kubenswrapper[4711]: I1203 12:18:28.137956 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mj5ds"
Dec 03 12:18:28 crc kubenswrapper[4711]: I1203 12:18:28.139284 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mj5ds"
Dec 03 12:18:28 crc kubenswrapper[4711]: I1203 12:18:28.180639 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mj5ds"
Dec 03 12:18:28 crc kubenswrapper[4711]: I1203 12:18:28.678597 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bd87v"]
Dec 03 12:18:28 crc kubenswrapper[4711]: I1203 12:18:28.720643 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bd87v" podUID="25415f1a-641d-41b6-b91c-a48b76a9d465" containerName="registry-server" containerID="cri-o://f128a2a5c30e840833f849ed31d2a0bbdab35dfadfc6c83dca08952cd687480f" gracePeriod=2
Dec 03 12:18:28 crc kubenswrapper[4711]: I1203 12:18:28.761258 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mj5ds"
Dec 03 12:18:28 crc kubenswrapper[4711]: I1203 12:18:28.808741 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lx4g4"
Dec 03 12:18:28 crc kubenswrapper[4711]: I1203 12:18:28.863731 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lx4g4"
Dec 03 12:18:28 crc kubenswrapper[4711]: I1203 12:18:28.877070 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rvlj8"]
Dec 03 12:18:28 crc kubenswrapper[4711]: I1203 12:18:28.877324 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rvlj8" podUID="80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02" containerName="registry-server" containerID="cri-o://9b226c9c8a105989c5aba1ea8fe9976f8228dae66080c85c6440ea1bfc08bd0a" gracePeriod=2
Dec 03 12:18:29 crc kubenswrapper[4711]: I1203 12:18:29.239820 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vwwvw"
Dec 03 12:18:29 crc kubenswrapper[4711]: I1203 12:18:29.280839 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vwwvw"
Dec 03 12:18:31 crc kubenswrapper[4711]: I1203 12:18:31.081504 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mj5ds"]
Dec 03 12:18:31 crc kubenswrapper[4711]: I1203 12:18:31.081886 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mj5ds" podUID="3d15af25-55a8-4d30-a83e-3bdf5dc963fc" containerName="registry-server" containerID="cri-o://82a2b77c9b50ff2b51c334b9bc5cd4a5c11c18fe1af29cf6d89de0688bd7cd44" gracePeriod=2
Dec 03 12:18:33 crc kubenswrapper[4711]: I1203 12:18:33.273632 4711 generic.go:334] "Generic (PLEG): container finished" podID="80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02" containerID="9b226c9c8a105989c5aba1ea8fe9976f8228dae66080c85c6440ea1bfc08bd0a" exitCode=0
Dec 03 12:18:33 crc kubenswrapper[4711]: I1203 12:18:33.273762 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvlj8" event={"ID":"80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02","Type":"ContainerDied","Data":"9b226c9c8a105989c5aba1ea8fe9976f8228dae66080c85c6440ea1bfc08bd0a"}
Dec 03 12:18:33 crc kubenswrapper[4711]: I1203 12:18:33.282649 4711 generic.go:334] "Generic (PLEG): container finished" podID="25415f1a-641d-41b6-b91c-a48b76a9d465" containerID="f128a2a5c30e840833f849ed31d2a0bbdab35dfadfc6c83dca08952cd687480f" exitCode=0
Dec 03 12:18:33 crc kubenswrapper[4711]: I1203 12:18:33.282694 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bd87v" event={"ID":"25415f1a-641d-41b6-b91c-a48b76a9d465","Type":"ContainerDied","Data":"f128a2a5c30e840833f849ed31d2a0bbdab35dfadfc6c83dca08952cd687480f"}
Dec 03 12:18:33 crc kubenswrapper[4711]: I1203 12:18:33.482233 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vwwvw"]
Dec 03 12:18:33 crc kubenswrapper[4711]: I1203 12:18:33.482605 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vwwvw" podUID="1ded0656-834a-4d3d-8cf8-71ed8075e3aa" containerName="registry-server" containerID="cri-o://57f461ce796465836aa539c0afff8095c77396ca0bcf2895af2578aaa724074e" gracePeriod=2
Dec 03 12:18:33 crc kubenswrapper[4711]: I1203 12:18:33.524943 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rvlj8"
Dec 03 12:18:33 crc kubenswrapper[4711]: I1203 12:18:33.536006 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bd87v"
Dec 03 12:18:33 crc kubenswrapper[4711]: I1203 12:18:33.586652 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25415f1a-641d-41b6-b91c-a48b76a9d465-catalog-content\") pod \"25415f1a-641d-41b6-b91c-a48b76a9d465\" (UID: \"25415f1a-641d-41b6-b91c-a48b76a9d465\") "
Dec 03 12:18:33 crc kubenswrapper[4711]: I1203 12:18:33.586738 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02-utilities\") pod \"80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02\" (UID: \"80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02\") "
Dec 03 12:18:33 crc kubenswrapper[4711]: I1203 12:18:33.586778 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25415f1a-641d-41b6-b91c-a48b76a9d465-utilities\") pod \"25415f1a-641d-41b6-b91c-a48b76a9d465\" (UID: \"25415f1a-641d-41b6-b91c-a48b76a9d465\") "
Dec 03 12:18:33 crc kubenswrapper[4711]: I1203 12:18:33.587005 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsfdv\" (UniqueName: \"kubernetes.io/projected/25415f1a-641d-41b6-b91c-a48b76a9d465-kube-api-access-rsfdv\") pod \"25415f1a-641d-41b6-b91c-a48b76a9d465\" (UID: \"25415f1a-641d-41b6-b91c-a48b76a9d465\") "
Dec 03 12:18:33 crc kubenswrapper[4711]: I1203 12:18:33.587033 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnkbr\" (UniqueName: \"kubernetes.io/projected/80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02-kube-api-access-gnkbr\") pod \"80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02\" (UID: \"80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02\") "
Dec 03 12:18:33 crc kubenswrapper[4711]: I1203 12:18:33.587056 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02-catalog-content\") pod \"80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02\" (UID: \"80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02\") "
Dec 03 12:18:33 crc kubenswrapper[4711]: I1203 12:18:33.587744 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02-utilities" (OuterVolumeSpecName: "utilities") pod "80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02" (UID: "80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:18:33 crc kubenswrapper[4711]: I1203 12:18:33.588490 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25415f1a-641d-41b6-b91c-a48b76a9d465-utilities" (OuterVolumeSpecName: "utilities") pod "25415f1a-641d-41b6-b91c-a48b76a9d465" (UID: "25415f1a-641d-41b6-b91c-a48b76a9d465"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:18:33 crc kubenswrapper[4711]: I1203 12:18:33.594849 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25415f1a-641d-41b6-b91c-a48b76a9d465-kube-api-access-rsfdv" (OuterVolumeSpecName: "kube-api-access-rsfdv") pod "25415f1a-641d-41b6-b91c-a48b76a9d465" (UID: "25415f1a-641d-41b6-b91c-a48b76a9d465"). InnerVolumeSpecName "kube-api-access-rsfdv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:18:33 crc kubenswrapper[4711]: I1203 12:18:33.596889 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02-kube-api-access-gnkbr" (OuterVolumeSpecName: "kube-api-access-gnkbr") pod "80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02" (UID: "80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02"). InnerVolumeSpecName "kube-api-access-gnkbr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:18:33 crc kubenswrapper[4711]: I1203 12:18:33.641521 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02" (UID: "80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:18:33 crc kubenswrapper[4711]: I1203 12:18:33.688209 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsfdv\" (UniqueName: \"kubernetes.io/projected/25415f1a-641d-41b6-b91c-a48b76a9d465-kube-api-access-rsfdv\") on node \"crc\" DevicePath \"\""
Dec 03 12:18:33 crc kubenswrapper[4711]: I1203 12:18:33.688552 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnkbr\" (UniqueName: \"kubernetes.io/projected/80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02-kube-api-access-gnkbr\") on node \"crc\" DevicePath \"\""
Dec 03 12:18:33 crc kubenswrapper[4711]: I1203 12:18:33.688564 4711 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 12:18:33 crc kubenswrapper[4711]: I1203 12:18:33.688575 4711 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 12:18:33 crc kubenswrapper[4711]: I1203 12:18:33.688586 4711 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25415f1a-641d-41b6-b91c-a48b76a9d465-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 12:18:34 crc kubenswrapper[4711]: I1203 12:18:34.315146 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bd87v" event={"ID":"25415f1a-641d-41b6-b91c-a48b76a9d465","Type":"ContainerDied","Data":"3e534c9314fe4453cd8e0bb3ef4ce06ce82a5474ac3031c4ea5504af743a47bc"}
Dec 03 12:18:34 crc kubenswrapper[4711]: I1203 12:18:34.315212 4711 scope.go:117] "RemoveContainer" containerID="f128a2a5c30e840833f849ed31d2a0bbdab35dfadfc6c83dca08952cd687480f"
Dec 03 12:18:34 crc kubenswrapper[4711]: I1203 12:18:34.315217 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bd87v"
Dec 03 12:18:34 crc kubenswrapper[4711]: I1203 12:18:34.319736 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvlj8" event={"ID":"80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02","Type":"ContainerDied","Data":"120bd45357dcacfdb2c5383883983a569e1ef2850f52e1e26f71cb7465dc539f"}
Dec 03 12:18:34 crc kubenswrapper[4711]: I1203 12:18:34.319800 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rvlj8"
Dec 03 12:18:34 crc kubenswrapper[4711]: I1203 12:18:34.323427 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mj5ds_3d15af25-55a8-4d30-a83e-3bdf5dc963fc/registry-server/0.log"
Dec 03 12:18:34 crc kubenswrapper[4711]: I1203 12:18:34.325312 4711 generic.go:334] "Generic (PLEG): container finished" podID="3d15af25-55a8-4d30-a83e-3bdf5dc963fc" containerID="82a2b77c9b50ff2b51c334b9bc5cd4a5c11c18fe1af29cf6d89de0688bd7cd44" exitCode=137
Dec 03 12:18:34 crc kubenswrapper[4711]: I1203 12:18:34.325394 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mj5ds" event={"ID":"3d15af25-55a8-4d30-a83e-3bdf5dc963fc","Type":"ContainerDied","Data":"82a2b77c9b50ff2b51c334b9bc5cd4a5c11c18fe1af29cf6d89de0688bd7cd44"}
Dec 03 12:18:34 crc kubenswrapper[4711]: I1203 12:18:34.348975 4711 scope.go:117] "RemoveContainer" containerID="0a1252ad33d3785c712cd560aff6586d7f3cdf28cce3d46d061e8cf02cbe7427"
Dec 03 12:18:34 crc kubenswrapper[4711]: I1203 12:18:34.352092 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rvlj8"]
Dec 03 12:18:34 crc kubenswrapper[4711]: I1203 12:18:34.356381 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rvlj8"]
Dec 03 12:18:34 crc kubenswrapper[4711]: I1203 12:18:34.371111 4711 scope.go:117] "RemoveContainer" containerID="462878b8f0b07442429d2b67848e8d18efc160e0a73a517d30d1a84a99fe222e"
Dec 03 12:18:34 crc kubenswrapper[4711]: I1203 12:18:34.446983 4711 scope.go:117] "RemoveContainer" containerID="9b226c9c8a105989c5aba1ea8fe9976f8228dae66080c85c6440ea1bfc08bd0a"
Dec 03 12:18:34 crc kubenswrapper[4711]: I1203 12:18:34.470087 4711 scope.go:117] "RemoveContainer" containerID="4302a43a8af92f42293423aff1e26f41a00379aac4d169393b0f8e9fd3fabbd2"
Dec 03 12:18:34 crc kubenswrapper[4711]: I1203 12:18:34.491926 4711 scope.go:117] "RemoveContainer" containerID="e708d292c30e2c36c131fce0bb662701bb89b7c5d718099e9882164e03dfe08f"
Dec 03 12:18:34 crc kubenswrapper[4711]: I1203 12:18:34.541858 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25415f1a-641d-41b6-b91c-a48b76a9d465-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "25415f1a-641d-41b6-b91c-a48b76a9d465" (UID: "25415f1a-641d-41b6-b91c-a48b76a9d465"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:18:34 crc kubenswrapper[4711]: I1203 12:18:34.601113 4711 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25415f1a-641d-41b6-b91c-a48b76a9d465-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 12:18:34 crc kubenswrapper[4711]: I1203 12:18:34.641487 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bd87v"]
Dec 03 12:18:34 crc kubenswrapper[4711]: I1203 12:18:34.645195 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bd87v"]
Dec 03 12:18:34 crc kubenswrapper[4711]: I1203 12:18:34.668625 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mj5ds_3d15af25-55a8-4d30-a83e-3bdf5dc963fc/registry-server/0.log"
Dec 03 12:18:34 crc kubenswrapper[4711]: I1203 12:18:34.669333 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mj5ds"
Dec 03 12:18:34 crc kubenswrapper[4711]: I1203 12:18:34.807414 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmwg5\" (UniqueName: \"kubernetes.io/projected/3d15af25-55a8-4d30-a83e-3bdf5dc963fc-kube-api-access-hmwg5\") pod \"3d15af25-55a8-4d30-a83e-3bdf5dc963fc\" (UID: \"3d15af25-55a8-4d30-a83e-3bdf5dc963fc\") "
Dec 03 12:18:34 crc kubenswrapper[4711]: I1203 12:18:34.807499 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d15af25-55a8-4d30-a83e-3bdf5dc963fc-catalog-content\") pod \"3d15af25-55a8-4d30-a83e-3bdf5dc963fc\" (UID: \"3d15af25-55a8-4d30-a83e-3bdf5dc963fc\") "
Dec 03 12:18:34 crc kubenswrapper[4711]: I1203 12:18:34.807584 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d15af25-55a8-4d30-a83e-3bdf5dc963fc-utilities\") pod \"3d15af25-55a8-4d30-a83e-3bdf5dc963fc\" (UID: \"3d15af25-55a8-4d30-a83e-3bdf5dc963fc\") "
Dec 03 12:18:34 crc kubenswrapper[4711]: I1203 12:18:34.808305 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d15af25-55a8-4d30-a83e-3bdf5dc963fc-utilities" (OuterVolumeSpecName: "utilities") pod "3d15af25-55a8-4d30-a83e-3bdf5dc963fc" (UID: "3d15af25-55a8-4d30-a83e-3bdf5dc963fc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:18:34 crc kubenswrapper[4711]: I1203 12:18:34.813367 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d15af25-55a8-4d30-a83e-3bdf5dc963fc-kube-api-access-hmwg5" (OuterVolumeSpecName: "kube-api-access-hmwg5") pod "3d15af25-55a8-4d30-a83e-3bdf5dc963fc" (UID: "3d15af25-55a8-4d30-a83e-3bdf5dc963fc"). InnerVolumeSpecName "kube-api-access-hmwg5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:18:34 crc kubenswrapper[4711]: I1203 12:18:34.826718 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d15af25-55a8-4d30-a83e-3bdf5dc963fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d15af25-55a8-4d30-a83e-3bdf5dc963fc" (UID: "3d15af25-55a8-4d30-a83e-3bdf5dc963fc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:18:34 crc kubenswrapper[4711]: I1203 12:18:34.909059 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmwg5\" (UniqueName: \"kubernetes.io/projected/3d15af25-55a8-4d30-a83e-3bdf5dc963fc-kube-api-access-hmwg5\") on node \"crc\" DevicePath \"\""
Dec 03 12:18:34 crc kubenswrapper[4711]: I1203 12:18:34.909117 4711 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d15af25-55a8-4d30-a83e-3bdf5dc963fc-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 12:18:34 crc kubenswrapper[4711]: I1203 12:18:34.909153 4711 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d15af25-55a8-4d30-a83e-3bdf5dc963fc-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 12:18:35 crc kubenswrapper[4711]: I1203 12:18:35.338609 4711 generic.go:334] "Generic (PLEG): container finished" podID="1ded0656-834a-4d3d-8cf8-71ed8075e3aa" containerID="57f461ce796465836aa539c0afff8095c77396ca0bcf2895af2578aaa724074e" exitCode=0
Dec 03 12:18:35 crc kubenswrapper[4711]: I1203 12:18:35.339065 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwwvw" event={"ID":"1ded0656-834a-4d3d-8cf8-71ed8075e3aa","Type":"ContainerDied","Data":"57f461ce796465836aa539c0afff8095c77396ca0bcf2895af2578aaa724074e"}
Dec 03 12:18:35 crc kubenswrapper[4711]: I1203 12:18:35.346290 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mj5ds_3d15af25-55a8-4d30-a83e-3bdf5dc963fc/registry-server/0.log"
Dec 03 12:18:35 crc kubenswrapper[4711]: I1203 12:18:35.346828 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mj5ds" event={"ID":"3d15af25-55a8-4d30-a83e-3bdf5dc963fc","Type":"ContainerDied","Data":"3fb60af4000c47fdfd0d63ae6495519e3d6be6b54c380c075d789e2dcc034d98"}
Dec 03 12:18:35 crc kubenswrapper[4711]: I1203 12:18:35.346871 4711 scope.go:117] "RemoveContainer" containerID="82a2b77c9b50ff2b51c334b9bc5cd4a5c11c18fe1af29cf6d89de0688bd7cd44"
Dec 03 12:18:35 crc kubenswrapper[4711]: I1203 12:18:35.347017 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mj5ds"
Dec 03 12:18:35 crc kubenswrapper[4711]: I1203 12:18:35.385652 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mj5ds"]
Dec 03 12:18:35 crc kubenswrapper[4711]: I1203 12:18:35.388876 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mj5ds"]
Dec 03 12:18:35 crc kubenswrapper[4711]: I1203 12:18:35.390052 4711 scope.go:117] "RemoveContainer" containerID="3abbc733c68c32db5f21ad99ed0cd52f52f76505b7be5ef4ad7fcb3360c360f0"
Dec 03 12:18:35 crc kubenswrapper[4711]: I1203 12:18:35.416887 4711 scope.go:117] "RemoveContainer" containerID="0202e5e6d5b30133473dd8129a6116bf4cb5d7e9a43b6cf4e6f47591b11e51e6"
Dec 03 12:18:35 crc kubenswrapper[4711]: I1203 12:18:35.482597 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vwwvw"
Dec 03 12:18:35 crc kubenswrapper[4711]: I1203 12:18:35.618696 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsz8v\" (UniqueName: \"kubernetes.io/projected/1ded0656-834a-4d3d-8cf8-71ed8075e3aa-kube-api-access-lsz8v\") pod \"1ded0656-834a-4d3d-8cf8-71ed8075e3aa\" (UID: \"1ded0656-834a-4d3d-8cf8-71ed8075e3aa\") "
Dec 03 12:18:35 crc kubenswrapper[4711]: I1203 12:18:35.618786 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ded0656-834a-4d3d-8cf8-71ed8075e3aa-utilities\") pod \"1ded0656-834a-4d3d-8cf8-71ed8075e3aa\" (UID: \"1ded0656-834a-4d3d-8cf8-71ed8075e3aa\") "
Dec 03 12:18:35 crc kubenswrapper[4711]: I1203 12:18:35.618829 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ded0656-834a-4d3d-8cf8-71ed8075e3aa-catalog-content\") pod \"1ded0656-834a-4d3d-8cf8-71ed8075e3aa\" (UID: \"1ded0656-834a-4d3d-8cf8-71ed8075e3aa\") "
Dec 03 12:18:35 crc kubenswrapper[4711]: I1203 12:18:35.619712 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ded0656-834a-4d3d-8cf8-71ed8075e3aa-utilities" (OuterVolumeSpecName: "utilities") pod "1ded0656-834a-4d3d-8cf8-71ed8075e3aa" (UID: "1ded0656-834a-4d3d-8cf8-71ed8075e3aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:18:35 crc kubenswrapper[4711]: I1203 12:18:35.626874 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ded0656-834a-4d3d-8cf8-71ed8075e3aa-kube-api-access-lsz8v" (OuterVolumeSpecName: "kube-api-access-lsz8v") pod "1ded0656-834a-4d3d-8cf8-71ed8075e3aa" (UID: "1ded0656-834a-4d3d-8cf8-71ed8075e3aa"). InnerVolumeSpecName "kube-api-access-lsz8v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:18:35 crc kubenswrapper[4711]: I1203 12:18:35.720730 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsz8v\" (UniqueName: \"kubernetes.io/projected/1ded0656-834a-4d3d-8cf8-71ed8075e3aa-kube-api-access-lsz8v\") on node \"crc\" DevicePath \"\""
Dec 03 12:18:35 crc kubenswrapper[4711]: I1203 12:18:35.720770 4711 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ded0656-834a-4d3d-8cf8-71ed8075e3aa-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 12:18:35 crc kubenswrapper[4711]: I1203 12:18:35.747510 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ded0656-834a-4d3d-8cf8-71ed8075e3aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ded0656-834a-4d3d-8cf8-71ed8075e3aa" (UID: "1ded0656-834a-4d3d-8cf8-71ed8075e3aa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:18:35 crc kubenswrapper[4711]: I1203 12:18:35.821356 4711 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ded0656-834a-4d3d-8cf8-71ed8075e3aa-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 12:18:35 crc kubenswrapper[4711]: I1203 12:18:35.824242 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25415f1a-641d-41b6-b91c-a48b76a9d465" path="/var/lib/kubelet/pods/25415f1a-641d-41b6-b91c-a48b76a9d465/volumes"
Dec 03 12:18:35 crc kubenswrapper[4711]: I1203 12:18:35.824937 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d15af25-55a8-4d30-a83e-3bdf5dc963fc" path="/var/lib/kubelet/pods/3d15af25-55a8-4d30-a83e-3bdf5dc963fc/volumes"
Dec 03 12:18:35 crc kubenswrapper[4711]: I1203 12:18:35.825595 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02" path="/var/lib/kubelet/pods/80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02/volumes"
Dec 03 12:18:36 crc kubenswrapper[4711]: I1203 12:18:36.354236 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwwvw" event={"ID":"1ded0656-834a-4d3d-8cf8-71ed8075e3aa","Type":"ContainerDied","Data":"acffa7977af31e9a837422ff0ccd2777a2610e3c34d90c11511a890c9835692c"}
Dec 03 12:18:36 crc kubenswrapper[4711]: I1203 12:18:36.354272 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vwwvw"
Dec 03 12:18:36 crc kubenswrapper[4711]: I1203 12:18:36.354293 4711 scope.go:117] "RemoveContainer" containerID="57f461ce796465836aa539c0afff8095c77396ca0bcf2895af2578aaa724074e"
Dec 03 12:18:36 crc kubenswrapper[4711]: I1203 12:18:36.370422 4711 scope.go:117] "RemoveContainer" containerID="6faa9e49e23b1c5f32fcf8276fca10f001407b3850dd548e835d40ffcf846b3b"
Dec 03 12:18:36 crc kubenswrapper[4711]: I1203 12:18:36.372455 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vwwvw"]
Dec 03 12:18:36 crc kubenswrapper[4711]: I1203 12:18:36.377788 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vwwvw"]
Dec 03 12:18:36 crc kubenswrapper[4711]: I1203 12:18:36.385279 4711 scope.go:117] "RemoveContainer" containerID="2cc7e6508d69dc7d3ca5a7984c4ee779f6f7609455b419da712739d2403b8c46"
Dec 03 12:18:37 crc kubenswrapper[4711]: I1203 12:18:37.825723 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ded0656-834a-4d3d-8cf8-71ed8075e3aa" path="/var/lib/kubelet/pods/1ded0656-834a-4d3d-8cf8-71ed8075e3aa/volumes"
Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.475643 4711 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Dec 03 12:18:41 crc kubenswrapper[4711]: E1203 12:18:41.476367 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d15af25-55a8-4d30-a83e-3bdf5dc963fc" containerName="extract-content"
Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.476391 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d15af25-55a8-4d30-a83e-3bdf5dc963fc" containerName="extract-content"
Dec 03 12:18:41 crc kubenswrapper[4711]: E1203 12:18:41.476411 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25415f1a-641d-41b6-b91c-a48b76a9d465" containerName="extract-utilities"
Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.476423 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="25415f1a-641d-41b6-b91c-a48b76a9d465" containerName="extract-utilities"
Dec 03 12:18:41 crc kubenswrapper[4711]: E1203 12:18:41.476438 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ded0656-834a-4d3d-8cf8-71ed8075e3aa" containerName="extract-content"
Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.476451 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ded0656-834a-4d3d-8cf8-71ed8075e3aa" containerName="extract-content"
Dec 03 12:18:41 crc kubenswrapper[4711]: E1203 12:18:41.476468 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02" containerName="extract-utilities"
Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.476478 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02" containerName="extract-utilities"
Dec 03 12:18:41 crc kubenswrapper[4711]: E1203 12:18:41.476495 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e1b2403-38fe-4376-971c-5b1334c7adef" containerName="pruner"
Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.476505 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e1b2403-38fe-4376-971c-5b1334c7adef" containerName="pruner"
Dec 03 12:18:41 crc kubenswrapper[4711]: E1203 12:18:41.476520 4711 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02" containerName="registry-server" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.476530 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02" containerName="registry-server" Dec 03 12:18:41 crc kubenswrapper[4711]: E1203 12:18:41.476547 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d15af25-55a8-4d30-a83e-3bdf5dc963fc" containerName="extract-utilities" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.476559 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d15af25-55a8-4d30-a83e-3bdf5dc963fc" containerName="extract-utilities" Dec 03 12:18:41 crc kubenswrapper[4711]: E1203 12:18:41.476572 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ded0656-834a-4d3d-8cf8-71ed8075e3aa" containerName="registry-server" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.476582 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ded0656-834a-4d3d-8cf8-71ed8075e3aa" containerName="registry-server" Dec 03 12:18:41 crc kubenswrapper[4711]: E1203 12:18:41.476594 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02" containerName="extract-content" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.476604 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02" containerName="extract-content" Dec 03 12:18:41 crc kubenswrapper[4711]: E1203 12:18:41.476622 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ded0656-834a-4d3d-8cf8-71ed8075e3aa" containerName="extract-utilities" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.476631 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ded0656-834a-4d3d-8cf8-71ed8075e3aa" containerName="extract-utilities" Dec 03 12:18:41 crc kubenswrapper[4711]: E1203 12:18:41.476643 4711 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="3d15af25-55a8-4d30-a83e-3bdf5dc963fc" containerName="registry-server" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.476654 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d15af25-55a8-4d30-a83e-3bdf5dc963fc" containerName="registry-server" Dec 03 12:18:41 crc kubenswrapper[4711]: E1203 12:18:41.476670 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25415f1a-641d-41b6-b91c-a48b76a9d465" containerName="registry-server" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.476679 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="25415f1a-641d-41b6-b91c-a48b76a9d465" containerName="registry-server" Dec 03 12:18:41 crc kubenswrapper[4711]: E1203 12:18:41.476690 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25415f1a-641d-41b6-b91c-a48b76a9d465" containerName="extract-content" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.476700 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="25415f1a-641d-41b6-b91c-a48b76a9d465" containerName="extract-content" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.476853 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ded0656-834a-4d3d-8cf8-71ed8075e3aa" containerName="registry-server" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.476875 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="25415f1a-641d-41b6-b91c-a48b76a9d465" containerName="registry-server" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.476889 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d15af25-55a8-4d30-a83e-3bdf5dc963fc" containerName="registry-server" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.476903 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="80b0d8c0-caf8-4d7e-8cd8-efe6677e4e02" containerName="registry-server" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.476940 4711 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="5e1b2403-38fe-4376-971c-5b1334c7adef" containerName="pruner" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.477417 4711 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.477455 4711 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 12:18:41 crc kubenswrapper[4711]: E1203 12:18:41.477628 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.477642 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 12:18:41 crc kubenswrapper[4711]: E1203 12:18:41.477660 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.477670 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 12:18:41 crc kubenswrapper[4711]: E1203 12:18:41.477687 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.477697 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.477707 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 12:18:41 crc kubenswrapper[4711]: E1203 12:18:41.477713 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.477877 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 12:18:41 crc kubenswrapper[4711]: E1203 12:18:41.477894 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.477964 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 12:18:41 crc kubenswrapper[4711]: E1203 12:18:41.477978 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.477989 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 12:18:41 crc kubenswrapper[4711]: E1203 12:18:41.478000 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.478010 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.478173 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 
12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.478193 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.478206 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.478229 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.478241 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.478252 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.478689 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0" gracePeriod=15 Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.478790 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0" gracePeriod=15 Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.478815 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c" gracePeriod=15 Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.478807 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84" gracePeriod=15 Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.478833 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce" gracePeriod=15 Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.486878 4711 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.595471 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.595802 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.595829 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.595855 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.595874 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.595898 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.595941 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.595959 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.697277 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.697338 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.697362 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.697393 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.697415 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.697423 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.697482 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.697487 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.697440 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.697512 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.697532 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.697538 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.697592 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.697658 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 12:18:41 crc 
kubenswrapper[4711]: I1203 12:18:41.697681 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:18:41 crc kubenswrapper[4711]: I1203 12:18:41.697759 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:18:42 crc kubenswrapper[4711]: I1203 12:18:42.395453 4711 generic.go:334] "Generic (PLEG): container finished" podID="40c22280-fb06-40d1-9599-ce73f2d137d9" containerID="f3e3f882682afa74d456a0b4465ef170691631f67c33d8bb380365aaba1cf4be" exitCode=0 Dec 03 12:18:42 crc kubenswrapper[4711]: I1203 12:18:42.395529 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"40c22280-fb06-40d1-9599-ce73f2d137d9","Type":"ContainerDied","Data":"f3e3f882682afa74d456a0b4465ef170691631f67c33d8bb380365aaba1cf4be"} Dec 03 12:18:42 crc kubenswrapper[4711]: I1203 12:18:42.396781 4711 status_manager.go:851] "Failed to get status for pod" podUID="40c22280-fb06-40d1-9599-ce73f2d137d9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 12:18:42 crc kubenswrapper[4711]: I1203 12:18:42.397786 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 03 12:18:42 crc kubenswrapper[4711]: I1203 12:18:42.398974 4711 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 12:18:42 crc kubenswrapper[4711]: I1203 12:18:42.399568 4711 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c" exitCode=0 Dec 03 12:18:42 crc kubenswrapper[4711]: I1203 12:18:42.399590 4711 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0" exitCode=0 Dec 03 12:18:42 crc kubenswrapper[4711]: I1203 12:18:42.399597 4711 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84" exitCode=0 Dec 03 12:18:42 crc kubenswrapper[4711]: I1203 12:18:42.399606 4711 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce" exitCode=2 Dec 03 12:18:42 crc kubenswrapper[4711]: I1203 12:18:42.399668 4711 scope.go:117] "RemoveContainer" containerID="c415b5c855c486279fbdd0b0ef3119f288f0e7804a4a08de3d955e414c4e1054" Dec 03 12:18:43 crc kubenswrapper[4711]: I1203 12:18:43.407581 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 12:18:43 crc kubenswrapper[4711]: I1203 12:18:43.669076 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 12:18:43 crc kubenswrapper[4711]: I1203 12:18:43.670389 4711 status_manager.go:851] "Failed to get status for pod" podUID="40c22280-fb06-40d1-9599-ce73f2d137d9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 12:18:43 crc kubenswrapper[4711]: I1203 12:18:43.722820 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/40c22280-fb06-40d1-9599-ce73f2d137d9-kubelet-dir\") pod \"40c22280-fb06-40d1-9599-ce73f2d137d9\" (UID: \"40c22280-fb06-40d1-9599-ce73f2d137d9\") " Dec 03 12:18:43 crc kubenswrapper[4711]: I1203 12:18:43.722950 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/40c22280-fb06-40d1-9599-ce73f2d137d9-kube-api-access\") pod \"40c22280-fb06-40d1-9599-ce73f2d137d9\" (UID: \"40c22280-fb06-40d1-9599-ce73f2d137d9\") " Dec 03 12:18:43 crc kubenswrapper[4711]: I1203 12:18:43.722990 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/40c22280-fb06-40d1-9599-ce73f2d137d9-var-lock\") pod \"40c22280-fb06-40d1-9599-ce73f2d137d9\" (UID: \"40c22280-fb06-40d1-9599-ce73f2d137d9\") " Dec 03 12:18:43 crc kubenswrapper[4711]: I1203 12:18:43.723047 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40c22280-fb06-40d1-9599-ce73f2d137d9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "40c22280-fb06-40d1-9599-ce73f2d137d9" (UID: "40c22280-fb06-40d1-9599-ce73f2d137d9"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:18:43 crc kubenswrapper[4711]: I1203 12:18:43.723121 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40c22280-fb06-40d1-9599-ce73f2d137d9-var-lock" (OuterVolumeSpecName: "var-lock") pod "40c22280-fb06-40d1-9599-ce73f2d137d9" (UID: "40c22280-fb06-40d1-9599-ce73f2d137d9"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:18:43 crc kubenswrapper[4711]: I1203 12:18:43.723318 4711 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/40c22280-fb06-40d1-9599-ce73f2d137d9-var-lock\") on node \"crc\" DevicePath \"\"" Dec 03 12:18:43 crc kubenswrapper[4711]: I1203 12:18:43.723339 4711 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/40c22280-fb06-40d1-9599-ce73f2d137d9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 12:18:43 crc kubenswrapper[4711]: I1203 12:18:43.728548 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40c22280-fb06-40d1-9599-ce73f2d137d9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "40c22280-fb06-40d1-9599-ce73f2d137d9" (UID: "40c22280-fb06-40d1-9599-ce73f2d137d9"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:18:43 crc kubenswrapper[4711]: I1203 12:18:43.823985 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/40c22280-fb06-40d1-9599-ce73f2d137d9-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 12:18:43 crc kubenswrapper[4711]: I1203 12:18:43.871265 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 12:18:43 crc kubenswrapper[4711]: I1203 12:18:43.871995 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:18:43 crc kubenswrapper[4711]: I1203 12:18:43.872550 4711 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 12:18:43 crc kubenswrapper[4711]: I1203 12:18:43.873031 4711 status_manager.go:851] "Failed to get status for pod" podUID="40c22280-fb06-40d1-9599-ce73f2d137d9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.026251 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.026332 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.026354 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.026459 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.026521 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.026628 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.026805 4711 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.026824 4711 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.026838 4711 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.417201 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"40c22280-fb06-40d1-9599-ce73f2d137d9","Type":"ContainerDied","Data":"09f930c899fd270d3214664df82125384a9557bba16988bb531d9c902286cf18"} Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.417247 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09f930c899fd270d3214664df82125384a9557bba16988bb531d9c902286cf18" Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.417339 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.420355 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.421158 4711 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0" exitCode=0 Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.421247 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.421291 4711 scope.go:117] "RemoveContainer" containerID="ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c" Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.422303 4711 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.422707 4711 status_manager.go:851] "Failed to get status for pod" podUID="40c22280-fb06-40d1-9599-ce73f2d137d9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.433805 4711 status_manager.go:851] "Failed to get status for pod" podUID="40c22280-fb06-40d1-9599-ce73f2d137d9" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.434145 4711 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.437263 4711 scope.go:117] "RemoveContainer" containerID="b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0" Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.450140 4711 scope.go:117] "RemoveContainer" containerID="19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84" Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.465127 4711 scope.go:117] "RemoveContainer" containerID="2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce" Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.480893 4711 scope.go:117] "RemoveContainer" containerID="902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0" Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.495892 4711 scope.go:117] "RemoveContainer" containerID="4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948" Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.515798 4711 scope.go:117] "RemoveContainer" containerID="ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c" Dec 03 12:18:44 crc kubenswrapper[4711]: E1203 12:18:44.516421 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\": container with ID starting with ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c not found: ID does not 
exist" containerID="ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c" Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.516461 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c"} err="failed to get container status \"ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\": rpc error: code = NotFound desc = could not find container \"ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c\": container with ID starting with ff1429975613fae592161a342d378955fdee9e2e80373c12c2ff163cba52c39c not found: ID does not exist" Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.516493 4711 scope.go:117] "RemoveContainer" containerID="b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0" Dec 03 12:18:44 crc kubenswrapper[4711]: E1203 12:18:44.516886 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\": container with ID starting with b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0 not found: ID does not exist" containerID="b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0" Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.516943 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0"} err="failed to get container status \"b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\": rpc error: code = NotFound desc = could not find container \"b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0\": container with ID starting with b4fff9446eedc5a4c916fd3e7b9df9bab38ee3d37352cad515cb40cac6adbfd0 not found: ID does not exist" Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.516972 4711 scope.go:117] 
"RemoveContainer" containerID="19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84" Dec 03 12:18:44 crc kubenswrapper[4711]: E1203 12:18:44.517326 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\": container with ID starting with 19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84 not found: ID does not exist" containerID="19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84" Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.517363 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84"} err="failed to get container status \"19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\": rpc error: code = NotFound desc = could not find container \"19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84\": container with ID starting with 19d24d810091fc3f824ebe0566ac4177deeda83ac06c56bd0f5d1dbbe944ae84 not found: ID does not exist" Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.517390 4711 scope.go:117] "RemoveContainer" containerID="2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce" Dec 03 12:18:44 crc kubenswrapper[4711]: E1203 12:18:44.517672 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\": container with ID starting with 2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce not found: ID does not exist" containerID="2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce" Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.517721 4711 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce"} err="failed to get container status \"2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\": rpc error: code = NotFound desc = could not find container \"2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce\": container with ID starting with 2d7f13d2946e36fac6d95102c195316722fd940a94f9d9cde8c362fb6422d1ce not found: ID does not exist" Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.517755 4711 scope.go:117] "RemoveContainer" containerID="902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0" Dec 03 12:18:44 crc kubenswrapper[4711]: E1203 12:18:44.518062 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\": container with ID starting with 902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0 not found: ID does not exist" containerID="902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0" Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.518091 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0"} err="failed to get container status \"902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\": rpc error: code = NotFound desc = could not find container \"902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0\": container with ID starting with 902d15eab43b0d9db360449eb7c9e61144c4328b74c35a5257e05bfda24a19a0 not found: ID does not exist" Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.518107 4711 scope.go:117] "RemoveContainer" containerID="4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948" Dec 03 12:18:44 crc kubenswrapper[4711]: E1203 12:18:44.518453 4711 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\": container with ID starting with 4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948 not found: ID does not exist" containerID="4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948" Dec 03 12:18:44 crc kubenswrapper[4711]: I1203 12:18:44.518488 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948"} err="failed to get container status \"4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\": rpc error: code = NotFound desc = could not find container \"4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948\": container with ID starting with 4f4cc04b8baea58a0fdcd83259fa4e6f92b58210713b70082fcfd17da952c948 not found: ID does not exist" Dec 03 12:18:45 crc kubenswrapper[4711]: E1203 12:18:45.270473 4711 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 12:18:45 crc kubenswrapper[4711]: E1203 12:18:45.270797 4711 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 12:18:45 crc kubenswrapper[4711]: E1203 12:18:45.271256 4711 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 12:18:45 crc kubenswrapper[4711]: E1203 12:18:45.271588 4711 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 12:18:45 crc kubenswrapper[4711]: E1203 12:18:45.271863 4711 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 12:18:45 crc kubenswrapper[4711]: I1203 12:18:45.271898 4711 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 03 12:18:45 crc kubenswrapper[4711]: E1203 12:18:45.272157 4711 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="200ms" Dec 03 12:18:45 crc kubenswrapper[4711]: E1203 12:18:45.473585 4711 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="400ms" Dec 03 12:18:45 crc kubenswrapper[4711]: I1203 12:18:45.823467 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 03 12:18:45 crc kubenswrapper[4711]: E1203 12:18:45.875229 4711 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="800ms" Dec 03 12:18:46 crc kubenswrapper[4711]: E1203 12:18:46.511338 4711 kubelet.go:1929] "Failed creating a 
mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.38:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 12:18:46 crc kubenswrapper[4711]: I1203 12:18:46.511871 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 12:18:46 crc kubenswrapper[4711]: E1203 12:18:46.566543 4711 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.38:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187db3ce0d3763d8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 12:18:46.565471192 +0000 UTC m=+245.234722487,LastTimestamp:2025-12-03 12:18:46.565471192 +0000 UTC m=+245.234722487,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 12:18:46 crc kubenswrapper[4711]: E1203 12:18:46.676449 4711 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="1.6s" Dec 03 12:18:47 crc 
kubenswrapper[4711]: I1203 12:18:47.451222 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"45eba17783606933821410d2b5e4e9508a24b176f6b97ac1dfd18473b8d1fdf6"} Dec 03 12:18:47 crc kubenswrapper[4711]: I1203 12:18:47.451273 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"e282247c938f88562bf81cde3e2e458b5aba4cefb3169a24d778937975e244f0"} Dec 03 12:18:47 crc kubenswrapper[4711]: E1203 12:18:47.452063 4711 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.38:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 12:18:47 crc kubenswrapper[4711]: I1203 12:18:47.452098 4711 status_manager.go:851] "Failed to get status for pod" podUID="40c22280-fb06-40d1-9599-ce73f2d137d9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 12:18:48 crc kubenswrapper[4711]: E1203 12:18:48.278052 4711 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="3.2s" Dec 03 12:18:49 crc kubenswrapper[4711]: E1203 12:18:49.763877 4711 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.38:6443: connect: connection refused" 
event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187db3ce0d3763d8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 12:18:46.565471192 +0000 UTC m=+245.234722487,LastTimestamp:2025-12-03 12:18:46.565471192 +0000 UTC m=+245.234722487,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 12:18:51 crc kubenswrapper[4711]: E1203 12:18:51.478707 4711 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="6.4s" Dec 03 12:18:51 crc kubenswrapper[4711]: I1203 12:18:51.820138 4711 status_manager.go:851] "Failed to get status for pod" podUID="40c22280-fb06-40d1-9599-ce73f2d137d9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 12:18:53 crc kubenswrapper[4711]: I1203 12:18:53.816521 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:18:53 crc kubenswrapper[4711]: I1203 12:18:53.817395 4711 status_manager.go:851] "Failed to get status for pod" podUID="40c22280-fb06-40d1-9599-ce73f2d137d9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 12:18:53 crc kubenswrapper[4711]: I1203 12:18:53.830604 4711 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c1d3b7df-87c6-4d58-a825-f8b19ad98966" Dec 03 12:18:53 crc kubenswrapper[4711]: I1203 12:18:53.830643 4711 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c1d3b7df-87c6-4d58-a825-f8b19ad98966" Dec 03 12:18:53 crc kubenswrapper[4711]: E1203 12:18:53.831324 4711 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:18:53 crc kubenswrapper[4711]: I1203 12:18:53.831828 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:18:54 crc kubenswrapper[4711]: I1203 12:18:54.500997 4711 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="e50cd899e25791b45ddad12e060383a3086d3654217aca558fa055fce2615616" exitCode=0 Dec 03 12:18:54 crc kubenswrapper[4711]: I1203 12:18:54.501086 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"e50cd899e25791b45ddad12e060383a3086d3654217aca558fa055fce2615616"} Dec 03 12:18:54 crc kubenswrapper[4711]: I1203 12:18:54.501297 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"cffdb9af7b704e5d36d5d9a7107672af32915d545f4031ac5b46e75443835f13"} Dec 03 12:18:54 crc kubenswrapper[4711]: I1203 12:18:54.501526 4711 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c1d3b7df-87c6-4d58-a825-f8b19ad98966" Dec 03 12:18:54 crc kubenswrapper[4711]: I1203 12:18:54.501540 4711 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c1d3b7df-87c6-4d58-a825-f8b19ad98966" Dec 03 12:18:54 crc kubenswrapper[4711]: I1203 12:18:54.501936 4711 status_manager.go:851] "Failed to get status for pod" podUID="40c22280-fb06-40d1-9599-ce73f2d137d9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 12:18:54 crc kubenswrapper[4711]: E1203 12:18:54.502039 4711 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 
38.102.83.38:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:18:55 crc kubenswrapper[4711]: I1203 12:18:55.511583 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7829ea6550305e57c035e46018d18858a28b32739bc679bcb8345163782637d4"} Dec 03 12:18:55 crc kubenswrapper[4711]: I1203 12:18:55.511944 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fa9ce4d470ea11b192b7eeee1e60a4e557e5f11e199c6649f80eb434e24dc997"} Dec 03 12:18:55 crc kubenswrapper[4711]: I1203 12:18:55.511960 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9d9c814d00bb623e5d46ffe7a3fd3809ce6ea2fc668ebf155cacb05c60685db7"} Dec 03 12:18:55 crc kubenswrapper[4711]: I1203 12:18:55.511970 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4fbdb9966b2d82ca15c8307364f36afb76cc875fa5ca3733c0d67751970162bf"} Dec 03 12:18:56 crc kubenswrapper[4711]: I1203 12:18:56.520697 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 03 12:18:56 crc kubenswrapper[4711]: I1203 12:18:56.520763 4711 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c" exitCode=1 Dec 03 12:18:56 crc kubenswrapper[4711]: I1203 12:18:56.520830 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c"} Dec 03 12:18:56 crc kubenswrapper[4711]: I1203 12:18:56.521401 4711 scope.go:117] "RemoveContainer" containerID="db6afa3e4831acf2ba9ecedab48a28b5ef4adb323b22915cb18811d7e197cc6c" Dec 03 12:18:56 crc kubenswrapper[4711]: I1203 12:18:56.526122 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4006e2742511a3c43790ef0a659f2189f095d99fc5100bd0157cd40a0bb48547"} Dec 03 12:18:56 crc kubenswrapper[4711]: I1203 12:18:56.526485 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:18:56 crc kubenswrapper[4711]: I1203 12:18:56.526538 4711 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c1d3b7df-87c6-4d58-a825-f8b19ad98966" Dec 03 12:18:56 crc kubenswrapper[4711]: I1203 12:18:56.526571 4711 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c1d3b7df-87c6-4d58-a825-f8b19ad98966" Dec 03 12:18:57 crc kubenswrapper[4711]: I1203 12:18:57.200004 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:18:57 crc kubenswrapper[4711]: I1203 12:18:57.535181 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 03 12:18:57 crc kubenswrapper[4711]: I1203 12:18:57.535244 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3e980124c5e876ba67797c09ff3c576b02d5d947d1f14dbf9e8fd57f94b8334f"} Dec 03 12:18:58 crc kubenswrapper[4711]: I1203 12:18:58.832717 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:18:58 crc kubenswrapper[4711]: I1203 12:18:58.832774 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:18:58 crc kubenswrapper[4711]: I1203 12:18:58.838180 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:19:01 crc kubenswrapper[4711]: I1203 12:19:01.538056 4711 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:19:01 crc kubenswrapper[4711]: I1203 12:19:01.826272 4711 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4f5ab622-3f1f-4f5d-93e3-9f4b10f3c15c" Dec 03 12:19:02 crc kubenswrapper[4711]: I1203 12:19:02.451880 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:19:02 crc kubenswrapper[4711]: I1203 12:19:02.568808 4711 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c1d3b7df-87c6-4d58-a825-f8b19ad98966" Dec 03 12:19:02 crc kubenswrapper[4711]: I1203 12:19:02.568845 4711 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c1d3b7df-87c6-4d58-a825-f8b19ad98966" Dec 03 12:19:02 crc kubenswrapper[4711]: I1203 12:19:02.572143 4711 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" 
oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4f5ab622-3f1f-4f5d-93e3-9f4b10f3c15c" Dec 03 12:19:02 crc kubenswrapper[4711]: I1203 12:19:02.575203 4711 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://4fbdb9966b2d82ca15c8307364f36afb76cc875fa5ca3733c0d67751970162bf" Dec 03 12:19:02 crc kubenswrapper[4711]: I1203 12:19:02.575252 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:19:03 crc kubenswrapper[4711]: I1203 12:19:03.575068 4711 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c1d3b7df-87c6-4d58-a825-f8b19ad98966" Dec 03 12:19:03 crc kubenswrapper[4711]: I1203 12:19:03.575362 4711 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c1d3b7df-87c6-4d58-a825-f8b19ad98966" Dec 03 12:19:03 crc kubenswrapper[4711]: I1203 12:19:03.579502 4711 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4f5ab622-3f1f-4f5d-93e3-9f4b10f3c15c" Dec 03 12:19:07 crc kubenswrapper[4711]: I1203 12:19:07.199884 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:19:07 crc kubenswrapper[4711]: I1203 12:19:07.204012 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:19:07 crc kubenswrapper[4711]: I1203 12:19:07.602592 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:19:12 crc kubenswrapper[4711]: I1203 12:19:12.941663 4711 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 03 12:19:13 crc kubenswrapper[4711]: I1203 12:19:13.008218 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 03 12:19:13 crc kubenswrapper[4711]: I1203 12:19:13.227390 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 03 12:19:13 crc kubenswrapper[4711]: I1203 12:19:13.497241 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 03 12:19:13 crc kubenswrapper[4711]: I1203 12:19:13.534385 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 03 12:19:13 crc kubenswrapper[4711]: I1203 12:19:13.581783 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 03 12:19:13 crc kubenswrapper[4711]: I1203 12:19:13.584829 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 03 12:19:13 crc kubenswrapper[4711]: I1203 12:19:13.608077 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 03 12:19:13 crc kubenswrapper[4711]: I1203 12:19:13.712742 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 03 12:19:14 crc kubenswrapper[4711]: I1203 12:19:14.077027 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 03 12:19:14 crc kubenswrapper[4711]: I1203 12:19:14.095428 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 03 12:19:14 crc kubenswrapper[4711]: I1203 12:19:14.139109 4711 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 03 12:19:14 crc kubenswrapper[4711]: I1203 12:19:14.217237 4711 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 03 12:19:14 crc kubenswrapper[4711]: I1203 12:19:14.356796 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 03 12:19:14 crc kubenswrapper[4711]: I1203 12:19:14.411656 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 03 12:19:14 crc kubenswrapper[4711]: I1203 12:19:14.469283 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 03 12:19:14 crc kubenswrapper[4711]: I1203 12:19:14.588463 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 03 12:19:14 crc kubenswrapper[4711]: I1203 12:19:14.668530 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 12:19:14 crc kubenswrapper[4711]: I1203 12:19:14.791048 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 03 12:19:14 crc kubenswrapper[4711]: I1203 12:19:14.832081 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 03 12:19:14 crc kubenswrapper[4711]: I1203 12:19:14.860449 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 12:19:14 crc kubenswrapper[4711]: I1203 12:19:14.873776 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 12:19:15 crc kubenswrapper[4711]: I1203 12:19:15.164776 4711 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 03 12:19:15 crc kubenswrapper[4711]: I1203 12:19:15.345307 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 03 12:19:15 crc kubenswrapper[4711]: I1203 12:19:15.442059 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 03 12:19:15 crc kubenswrapper[4711]: I1203 12:19:15.463850 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 03 12:19:15 crc kubenswrapper[4711]: I1203 12:19:15.496788 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 03 12:19:15 crc kubenswrapper[4711]: I1203 12:19:15.620507 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 03 12:19:15 crc kubenswrapper[4711]: I1203 12:19:15.681834 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 12:19:15 crc kubenswrapper[4711]: I1203 12:19:15.735649 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 03 12:19:15 crc kubenswrapper[4711]: I1203 12:19:15.780533 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 03 12:19:15 crc kubenswrapper[4711]: I1203 12:19:15.793605 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 03 12:19:15 crc kubenswrapper[4711]: I1203 12:19:15.878518 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 03 12:19:15 crc kubenswrapper[4711]: 
I1203 12:19:15.949010 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 03 12:19:15 crc kubenswrapper[4711]: I1203 12:19:15.996404 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 12:19:16 crc kubenswrapper[4711]: I1203 12:19:16.003525 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 03 12:19:16 crc kubenswrapper[4711]: I1203 12:19:16.028778 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 03 12:19:16 crc kubenswrapper[4711]: I1203 12:19:16.062396 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 03 12:19:16 crc kubenswrapper[4711]: I1203 12:19:16.064852 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 03 12:19:16 crc kubenswrapper[4711]: I1203 12:19:16.212648 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 03 12:19:16 crc kubenswrapper[4711]: I1203 12:19:16.250484 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 03 12:19:16 crc kubenswrapper[4711]: I1203 12:19:16.401204 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 03 12:19:16 crc kubenswrapper[4711]: I1203 12:19:16.440061 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 03 12:19:16 crc kubenswrapper[4711]: I1203 12:19:16.686835 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 03 12:19:16 crc 
kubenswrapper[4711]: I1203 12:19:16.755464 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 03 12:19:16 crc kubenswrapper[4711]: I1203 12:19:16.759658 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 03 12:19:16 crc kubenswrapper[4711]: I1203 12:19:16.817961 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 03 12:19:16 crc kubenswrapper[4711]: I1203 12:19:16.818217 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 03 12:19:16 crc kubenswrapper[4711]: I1203 12:19:16.857231 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 03 12:19:17 crc kubenswrapper[4711]: I1203 12:19:17.031576 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 03 12:19:17 crc kubenswrapper[4711]: I1203 12:19:17.037274 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 03 12:19:17 crc kubenswrapper[4711]: I1203 12:19:17.048364 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 03 12:19:17 crc kubenswrapper[4711]: I1203 12:19:17.155844 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 03 12:19:17 crc kubenswrapper[4711]: I1203 12:19:17.158899 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 03 12:19:17 crc kubenswrapper[4711]: I1203 12:19:17.158948 4711 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 03 12:19:17 crc kubenswrapper[4711]: I1203 12:19:17.222568 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 03 12:19:17 crc kubenswrapper[4711]: I1203 12:19:17.373864 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 03 12:19:17 crc kubenswrapper[4711]: I1203 12:19:17.383326 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 03 12:19:17 crc kubenswrapper[4711]: I1203 12:19:17.383537 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 03 12:19:17 crc kubenswrapper[4711]: I1203 12:19:17.475999 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 03 12:19:17 crc kubenswrapper[4711]: I1203 12:19:17.483053 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 03 12:19:17 crc kubenswrapper[4711]: I1203 12:19:17.557815 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 03 12:19:17 crc kubenswrapper[4711]: I1203 12:19:17.574147 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 03 12:19:17 crc kubenswrapper[4711]: I1203 12:19:17.631872 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 12:19:17 crc kubenswrapper[4711]: I1203 12:19:17.656088 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 03 
12:19:17 crc kubenswrapper[4711]: I1203 12:19:17.713668 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 12:19:17 crc kubenswrapper[4711]: I1203 12:19:17.737283 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 03 12:19:17 crc kubenswrapper[4711]: I1203 12:19:17.784703 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 03 12:19:17 crc kubenswrapper[4711]: I1203 12:19:17.813115 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 03 12:19:17 crc kubenswrapper[4711]: I1203 12:19:17.831542 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 03 12:19:17 crc kubenswrapper[4711]: I1203 12:19:17.873344 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 03 12:19:17 crc kubenswrapper[4711]: I1203 12:19:17.923802 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 03 12:19:17 crc kubenswrapper[4711]: I1203 12:19:17.926771 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 12:19:17 crc kubenswrapper[4711]: I1203 12:19:17.927552 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 03 12:19:18 crc kubenswrapper[4711]: I1203 12:19:18.122795 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 03 12:19:18 crc kubenswrapper[4711]: I1203 12:19:18.142236 4711 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 03 12:19:18 crc kubenswrapper[4711]: I1203 12:19:18.159269 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 03 12:19:18 crc kubenswrapper[4711]: I1203 12:19:18.162985 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 03 12:19:18 crc kubenswrapper[4711]: I1203 12:19:18.218054 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 03 12:19:18 crc kubenswrapper[4711]: I1203 12:19:18.226723 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 03 12:19:18 crc kubenswrapper[4711]: I1203 12:19:18.273481 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 12:19:18 crc kubenswrapper[4711]: I1203 12:19:18.349235 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 03 12:19:18 crc kubenswrapper[4711]: I1203 12:19:18.359357 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 03 12:19:18 crc kubenswrapper[4711]: I1203 12:19:18.373271 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 03 12:19:18 crc kubenswrapper[4711]: I1203 12:19:18.405641 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 03 12:19:18 crc kubenswrapper[4711]: I1203 12:19:18.472741 4711 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 03 12:19:18 crc kubenswrapper[4711]: I1203 12:19:18.474463 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 03 12:19:18 crc kubenswrapper[4711]: I1203 12:19:18.532627 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 03 12:19:18 crc kubenswrapper[4711]: I1203 12:19:18.566289 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 03 12:19:18 crc kubenswrapper[4711]: I1203 12:19:18.566289 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 03 12:19:18 crc kubenswrapper[4711]: I1203 12:19:18.571599 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 03 12:19:18 crc kubenswrapper[4711]: I1203 12:19:18.692455 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 03 12:19:18 crc kubenswrapper[4711]: I1203 12:19:18.728684 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 03 12:19:18 crc kubenswrapper[4711]: I1203 12:19:18.743612 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 03 12:19:18 crc kubenswrapper[4711]: I1203 12:19:18.802737 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 03 12:19:18 crc kubenswrapper[4711]: I1203 12:19:18.877557 4711 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 03 12:19:18 crc kubenswrapper[4711]: I1203 12:19:18.884823 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 03 12:19:18 crc kubenswrapper[4711]: I1203 12:19:18.947979 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 03 12:19:19 crc kubenswrapper[4711]: I1203 12:19:19.005312 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 03 12:19:19 crc kubenswrapper[4711]: I1203 12:19:19.019065 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 12:19:19 crc kubenswrapper[4711]: I1203 12:19:19.108729 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 03 12:19:19 crc kubenswrapper[4711]: I1203 12:19:19.140703 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 03 12:19:19 crc kubenswrapper[4711]: I1203 12:19:19.143330 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 03 12:19:19 crc kubenswrapper[4711]: I1203 12:19:19.165769 4711 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 03 12:19:19 crc kubenswrapper[4711]: I1203 12:19:19.186761 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 03 12:19:19 crc kubenswrapper[4711]: I1203 12:19:19.237835 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 03 12:19:19 crc kubenswrapper[4711]: I1203 12:19:19.317567 4711 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 03 12:19:19 crc kubenswrapper[4711]: I1203 12:19:19.326812 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 12:19:19 crc kubenswrapper[4711]: I1203 12:19:19.328037 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 03 12:19:19 crc kubenswrapper[4711]: I1203 12:19:19.487331 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 03 12:19:19 crc kubenswrapper[4711]: I1203 12:19:19.648831 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 03 12:19:19 crc kubenswrapper[4711]: I1203 12:19:19.687718 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 03 12:19:19 crc kubenswrapper[4711]: I1203 12:19:19.727446 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 03 12:19:19 crc kubenswrapper[4711]: I1203 12:19:19.740497 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 03 12:19:19 crc kubenswrapper[4711]: I1203 12:19:19.745584 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 03 12:19:19 crc kubenswrapper[4711]: I1203 12:19:19.749058 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 03 12:19:19 crc kubenswrapper[4711]: I1203 12:19:19.780628 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 03 
12:19:19 crc kubenswrapper[4711]: I1203 12:19:19.889342 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 03 12:19:19 crc kubenswrapper[4711]: I1203 12:19:19.963894 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 03 12:19:20 crc kubenswrapper[4711]: I1203 12:19:20.080542 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 03 12:19:20 crc kubenswrapper[4711]: I1203 12:19:20.178132 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 03 12:19:20 crc kubenswrapper[4711]: I1203 12:19:20.313804 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 03 12:19:20 crc kubenswrapper[4711]: I1203 12:19:20.325598 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 03 12:19:20 crc kubenswrapper[4711]: I1203 12:19:20.333062 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 03 12:19:20 crc kubenswrapper[4711]: I1203 12:19:20.383653 4711 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 03 12:19:20 crc kubenswrapper[4711]: I1203 12:19:20.424087 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 03 12:19:20 crc kubenswrapper[4711]: I1203 12:19:20.474326 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 03 12:19:20 crc kubenswrapper[4711]: I1203 12:19:20.519093 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 03 12:19:20 crc 
kubenswrapper[4711]: I1203 12:19:20.610827 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 03 12:19:20 crc kubenswrapper[4711]: I1203 12:19:20.746519 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 03 12:19:20 crc kubenswrapper[4711]: I1203 12:19:20.780174 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 03 12:19:20 crc kubenswrapper[4711]: I1203 12:19:20.888349 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 03 12:19:20 crc kubenswrapper[4711]: I1203 12:19:20.981428 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 03 12:19:20 crc kubenswrapper[4711]: I1203 12:19:20.981851 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 03 12:19:21 crc kubenswrapper[4711]: I1203 12:19:21.011265 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 03 12:19:21 crc kubenswrapper[4711]: I1203 12:19:21.078318 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 03 12:19:21 crc kubenswrapper[4711]: I1203 12:19:21.099651 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 03 12:19:21 crc kubenswrapper[4711]: I1203 12:19:21.101068 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 03 12:19:21 crc kubenswrapper[4711]: I1203 12:19:21.115007 4711 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 03 12:19:21 crc kubenswrapper[4711]: I1203 12:19:21.242181 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 03 12:19:21 crc kubenswrapper[4711]: I1203 12:19:21.277405 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 03 12:19:21 crc kubenswrapper[4711]: I1203 12:19:21.383701 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 03 12:19:21 crc kubenswrapper[4711]: I1203 12:19:21.397619 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 03 12:19:21 crc kubenswrapper[4711]: I1203 12:19:21.421075 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 03 12:19:21 crc kubenswrapper[4711]: I1203 12:19:21.506858 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 03 12:19:21 crc kubenswrapper[4711]: I1203 12:19:21.585175 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 12:19:21 crc kubenswrapper[4711]: I1203 12:19:21.703945 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 12:19:21 crc kubenswrapper[4711]: I1203 12:19:21.744511 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 03 12:19:21 crc kubenswrapper[4711]: I1203 12:19:21.749006 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 03 12:19:21 crc 
kubenswrapper[4711]: I1203 12:19:21.762108 4711 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 03 12:19:21 crc kubenswrapper[4711]: I1203 12:19:21.796880 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 03 12:19:21 crc kubenswrapper[4711]: I1203 12:19:21.878550 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 03 12:19:21 crc kubenswrapper[4711]: I1203 12:19:21.883756 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 03 12:19:21 crc kubenswrapper[4711]: I1203 12:19:21.884611 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 03 12:19:21 crc kubenswrapper[4711]: I1203 12:19:21.943538 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 03 12:19:21 crc kubenswrapper[4711]: I1203 12:19:21.955126 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 03 12:19:21 crc kubenswrapper[4711]: I1203 12:19:21.998119 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 03 12:19:22 crc kubenswrapper[4711]: I1203 12:19:22.112592 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 03 12:19:22 crc kubenswrapper[4711]: I1203 12:19:22.213124 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 03 12:19:22 crc kubenswrapper[4711]: I1203 12:19:22.289488 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 03 
12:19:22 crc kubenswrapper[4711]: I1203 12:19:22.400129 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 03 12:19:22 crc kubenswrapper[4711]: I1203 12:19:22.424887 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 03 12:19:22 crc kubenswrapper[4711]: I1203 12:19:22.439717 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 03 12:19:22 crc kubenswrapper[4711]: I1203 12:19:22.551107 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 03 12:19:22 crc kubenswrapper[4711]: I1203 12:19:22.642803 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 12:19:22 crc kubenswrapper[4711]: I1203 12:19:22.686650 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 03 12:19:22 crc kubenswrapper[4711]: I1203 12:19:22.725571 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 03 12:19:22 crc kubenswrapper[4711]: I1203 12:19:22.751293 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 03 12:19:22 crc kubenswrapper[4711]: I1203 12:19:22.928922 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 12:19:22 crc kubenswrapper[4711]: I1203 12:19:22.959094 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 03 12:19:22 crc kubenswrapper[4711]: I1203 12:19:22.960995 4711 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 03 12:19:23 crc kubenswrapper[4711]: I1203 12:19:23.016012 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 03 12:19:23 crc kubenswrapper[4711]: I1203 12:19:23.201769 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 03 12:19:23 crc kubenswrapper[4711]: I1203 12:19:23.368934 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 03 12:19:23 crc kubenswrapper[4711]: I1203 12:19:23.398764 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 12:19:23 crc kubenswrapper[4711]: I1203 12:19:23.441521 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 03 12:19:23 crc kubenswrapper[4711]: I1203 12:19:23.465411 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 03 12:19:23 crc kubenswrapper[4711]: I1203 12:19:23.579623 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 03 12:19:23 crc kubenswrapper[4711]: I1203 12:19:23.643895 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 03 12:19:23 crc kubenswrapper[4711]: I1203 12:19:23.665128 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 03 12:19:23 crc kubenswrapper[4711]: I1203 12:19:23.758575 4711 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 03 12:19:23 crc kubenswrapper[4711]: I1203 12:19:23.787862 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 03 12:19:23 crc kubenswrapper[4711]: I1203 12:19:23.802314 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 03 12:19:23 crc kubenswrapper[4711]: I1203 12:19:23.855480 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 03 12:19:23 crc kubenswrapper[4711]: I1203 12:19:23.864730 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 03 12:19:23 crc kubenswrapper[4711]: I1203 12:19:23.882537 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 03 12:19:23 crc kubenswrapper[4711]: I1203 12:19:23.906945 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 03 12:19:23 crc kubenswrapper[4711]: I1203 12:19:23.907351 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 03 12:19:24 crc kubenswrapper[4711]: I1203 12:19:24.006861 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 03 12:19:24 crc kubenswrapper[4711]: I1203 12:19:24.131217 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 03 12:19:24 crc kubenswrapper[4711]: I1203 12:19:24.163111 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 03 12:19:24 crc 
kubenswrapper[4711]: I1203 12:19:24.230645 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 03 12:19:24 crc kubenswrapper[4711]: I1203 12:19:24.233005 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 03 12:19:24 crc kubenswrapper[4711]: I1203 12:19:24.404941 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 03 12:19:24 crc kubenswrapper[4711]: I1203 12:19:24.515109 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 03 12:19:24 crc kubenswrapper[4711]: I1203 12:19:24.574748 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 03 12:19:24 crc kubenswrapper[4711]: I1203 12:19:24.585654 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 03 12:19:24 crc kubenswrapper[4711]: I1203 12:19:24.605242 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 03 12:19:24 crc kubenswrapper[4711]: I1203 12:19:24.642748 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 03 12:19:24 crc kubenswrapper[4711]: I1203 12:19:24.684152 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 03 12:19:24 crc kubenswrapper[4711]: I1203 12:19:24.712890 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 03 12:19:24 crc kubenswrapper[4711]: I1203 12:19:24.719981 4711 reflector.go:368] Caches 
populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 03 12:19:24 crc kubenswrapper[4711]: I1203 12:19:24.724085 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 12:19:24 crc kubenswrapper[4711]: I1203 12:19:24.724149 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 12:19:24 crc kubenswrapper[4711]: I1203 12:19:24.733317 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:19:24 crc kubenswrapper[4711]: I1203 12:19:24.747199 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 03 12:19:24 crc kubenswrapper[4711]: I1203 12:19:24.763234 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=23.763212425 podStartE2EDuration="23.763212425s" podCreationTimestamp="2025-12-03 12:19:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:19:24.744226294 +0000 UTC m=+283.413477569" watchObservedRunningTime="2025-12-03 12:19:24.763212425 +0000 UTC m=+283.432463680" Dec 03 12:19:24 crc kubenswrapper[4711]: I1203 12:19:24.881096 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 03 12:19:24 crc kubenswrapper[4711]: I1203 12:19:24.986464 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 03 12:19:25 crc kubenswrapper[4711]: I1203 12:19:25.201043 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 03 12:19:25 crc kubenswrapper[4711]: I1203 12:19:25.236735 4711 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 03 12:19:25 crc kubenswrapper[4711]: I1203 12:19:25.248644 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 03 12:19:25 crc kubenswrapper[4711]: I1203 12:19:25.621487 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 03 12:19:25 crc kubenswrapper[4711]: I1203 12:19:25.704814 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 03 12:19:25 crc kubenswrapper[4711]: I1203 12:19:25.718134 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 03 12:19:25 crc kubenswrapper[4711]: I1203 12:19:25.833377 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 03 12:19:25 crc kubenswrapper[4711]: I1203 12:19:25.862058 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 03 12:19:25 crc kubenswrapper[4711]: I1203 12:19:25.887929 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 03 12:19:25 crc kubenswrapper[4711]: I1203 12:19:25.918946 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 03 12:19:25 crc kubenswrapper[4711]: I1203 12:19:25.937151 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 03 12:19:26 crc kubenswrapper[4711]: I1203 12:19:26.046575 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 03 12:19:26 crc kubenswrapper[4711]: I1203 12:19:26.163607 4711 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 03 12:19:26 crc kubenswrapper[4711]: I1203 12:19:26.270863 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 03 12:19:26 crc kubenswrapper[4711]: I1203 12:19:26.316144 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 03 12:19:26 crc kubenswrapper[4711]: I1203 12:19:26.354299 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 03 12:19:26 crc kubenswrapper[4711]: I1203 12:19:26.419795 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 03 12:19:26 crc kubenswrapper[4711]: I1203 12:19:26.491978 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 03 12:19:26 crc kubenswrapper[4711]: I1203 12:19:26.625514 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 03 12:19:26 crc kubenswrapper[4711]: I1203 12:19:26.712662 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 03 12:19:26 crc kubenswrapper[4711]: I1203 12:19:26.932676 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 12:19:26 crc kubenswrapper[4711]: I1203 12:19:26.954680 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 03 12:19:26 crc kubenswrapper[4711]: I1203 12:19:26.971366 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 03 12:19:26 crc kubenswrapper[4711]: I1203 12:19:26.991526 4711 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 03 12:19:27 crc kubenswrapper[4711]: I1203 12:19:27.021424 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 03 12:19:27 crc kubenswrapper[4711]: I1203 12:19:27.069296 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 03 12:19:27 crc kubenswrapper[4711]: I1203 12:19:27.130672 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 03 12:19:27 crc kubenswrapper[4711]: I1203 12:19:27.150952 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 03 12:19:27 crc kubenswrapper[4711]: I1203 12:19:27.273902 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 03 12:19:27 crc kubenswrapper[4711]: I1203 12:19:27.296024 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 03 12:19:27 crc kubenswrapper[4711]: I1203 12:19:27.319785 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 03 12:19:27 crc kubenswrapper[4711]: I1203 12:19:27.422076 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 03 12:19:27 crc kubenswrapper[4711]: I1203 12:19:27.460279 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 03 12:19:27 crc kubenswrapper[4711]: I1203 12:19:27.616302 4711 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 03 12:19:27 crc kubenswrapper[4711]: I1203 12:19:27.968884 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 03 12:19:28 crc kubenswrapper[4711]: I1203 12:19:28.001307 4711 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 03 12:19:28 crc kubenswrapper[4711]: I1203 12:19:28.123442 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 03 12:19:28 crc kubenswrapper[4711]: I1203 12:19:28.666623 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 03 12:19:28 crc kubenswrapper[4711]: I1203 12:19:28.714770 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 03 12:19:28 crc kubenswrapper[4711]: I1203 12:19:28.798187 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 03 12:19:28 crc kubenswrapper[4711]: I1203 12:19:28.983486 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 03 12:19:29 crc kubenswrapper[4711]: I1203 12:19:29.036390 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 03 12:19:29 crc kubenswrapper[4711]: I1203 12:19:29.100075 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 03 12:19:29 crc kubenswrapper[4711]: I1203 12:19:29.106811 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 03 12:19:29 crc kubenswrapper[4711]: I1203 12:19:29.771899 4711 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 03 12:19:29 crc kubenswrapper[4711]: I1203 12:19:29.933830 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 03 12:19:31 crc kubenswrapper[4711]: I1203 12:19:31.084146 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 03 12:19:35 crc kubenswrapper[4711]: I1203 12:19:35.171364 4711 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 12:19:35 crc kubenswrapper[4711]: I1203 12:19:35.172974 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://45eba17783606933821410d2b5e4e9508a24b176f6b97ac1dfd18473b8d1fdf6" gracePeriod=5 Dec 03 12:19:40 crc kubenswrapper[4711]: I1203 12:19:40.749516 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 03 12:19:40 crc kubenswrapper[4711]: I1203 12:19:40.749822 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 12:19:40 crc kubenswrapper[4711]: I1203 12:19:40.773237 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 03 12:19:40 crc kubenswrapper[4711]: I1203 12:19:40.773281 4711 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="45eba17783606933821410d2b5e4e9508a24b176f6b97ac1dfd18473b8d1fdf6" exitCode=137 Dec 03 12:19:40 crc kubenswrapper[4711]: I1203 12:19:40.773319 4711 scope.go:117] "RemoveContainer" containerID="45eba17783606933821410d2b5e4e9508a24b176f6b97ac1dfd18473b8d1fdf6" Dec 03 12:19:40 crc kubenswrapper[4711]: I1203 12:19:40.773346 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 12:19:40 crc kubenswrapper[4711]: I1203 12:19:40.787897 4711 scope.go:117] "RemoveContainer" containerID="45eba17783606933821410d2b5e4e9508a24b176f6b97ac1dfd18473b8d1fdf6" Dec 03 12:19:40 crc kubenswrapper[4711]: E1203 12:19:40.788320 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45eba17783606933821410d2b5e4e9508a24b176f6b97ac1dfd18473b8d1fdf6\": container with ID starting with 45eba17783606933821410d2b5e4e9508a24b176f6b97ac1dfd18473b8d1fdf6 not found: ID does not exist" containerID="45eba17783606933821410d2b5e4e9508a24b176f6b97ac1dfd18473b8d1fdf6" Dec 03 12:19:40 crc kubenswrapper[4711]: I1203 12:19:40.788367 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45eba17783606933821410d2b5e4e9508a24b176f6b97ac1dfd18473b8d1fdf6"} err="failed to get container status \"45eba17783606933821410d2b5e4e9508a24b176f6b97ac1dfd18473b8d1fdf6\": rpc error: code = NotFound desc = could 
not find container \"45eba17783606933821410d2b5e4e9508a24b176f6b97ac1dfd18473b8d1fdf6\": container with ID starting with 45eba17783606933821410d2b5e4e9508a24b176f6b97ac1dfd18473b8d1fdf6 not found: ID does not exist" Dec 03 12:19:40 crc kubenswrapper[4711]: I1203 12:19:40.803878 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 12:19:40 crc kubenswrapper[4711]: I1203 12:19:40.803983 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 12:19:40 crc kubenswrapper[4711]: I1203 12:19:40.804006 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 12:19:40 crc kubenswrapper[4711]: I1203 12:19:40.804030 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 12:19:40 crc kubenswrapper[4711]: I1203 12:19:40.804038 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:19:40 crc kubenswrapper[4711]: I1203 12:19:40.804072 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:19:40 crc kubenswrapper[4711]: I1203 12:19:40.804029 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:19:40 crc kubenswrapper[4711]: I1203 12:19:40.804094 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:19:40 crc kubenswrapper[4711]: I1203 12:19:40.804050 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 12:19:40 crc kubenswrapper[4711]: I1203 12:19:40.804534 4711 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 12:19:40 crc kubenswrapper[4711]: I1203 12:19:40.804552 4711 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 03 12:19:40 crc kubenswrapper[4711]: I1203 12:19:40.804560 4711 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 03 12:19:40 crc kubenswrapper[4711]: I1203 12:19:40.804568 4711 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 03 12:19:40 crc kubenswrapper[4711]: I1203 12:19:40.819252 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:19:40 crc kubenswrapper[4711]: I1203 12:19:40.906174 4711 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 12:19:41 crc kubenswrapper[4711]: I1203 12:19:41.829878 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 03 12:19:48 crc kubenswrapper[4711]: I1203 12:19:48.824585 4711 generic.go:334] "Generic (PLEG): container finished" podID="9bc96e8b-e5df-40a7-8690-530895999a16" containerID="b8cce5bd203fd0bc22fc756a03a79d8882d84cf0cd32cde01cee100fdf2b6dea" exitCode=0 Dec 03 12:19:48 crc kubenswrapper[4711]: I1203 12:19:48.824716 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-c8g7k" event={"ID":"9bc96e8b-e5df-40a7-8690-530895999a16","Type":"ContainerDied","Data":"b8cce5bd203fd0bc22fc756a03a79d8882d84cf0cd32cde01cee100fdf2b6dea"} Dec 03 12:19:48 crc kubenswrapper[4711]: I1203 12:19:48.825159 4711 scope.go:117] "RemoveContainer" containerID="b8cce5bd203fd0bc22fc756a03a79d8882d84cf0cd32cde01cee100fdf2b6dea" Dec 03 12:19:49 crc kubenswrapper[4711]: I1203 12:19:49.832399 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-c8g7k" event={"ID":"9bc96e8b-e5df-40a7-8690-530895999a16","Type":"ContainerStarted","Data":"25ee83eb1ecb6b22ee0daf5771cd2e5853ee9c9e015b6379e15b9a24e3f1adaf"} Dec 03 12:19:49 crc kubenswrapper[4711]: I1203 12:19:49.832772 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-c8g7k" Dec 03 12:19:49 crc kubenswrapper[4711]: I1203 12:19:49.835985 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-c8g7k" Dec 03 12:20:35 crc kubenswrapper[4711]: I1203 12:20:35.402232 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:20:35 crc kubenswrapper[4711]: I1203 12:20:35.402834 4711 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:21:05 crc kubenswrapper[4711]: I1203 12:21:05.401851 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:21:05 crc kubenswrapper[4711]: I1203 12:21:05.402390 4711 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:21:35 crc kubenswrapper[4711]: I1203 12:21:35.401599 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:21:35 crc kubenswrapper[4711]: I1203 12:21:35.402756 4711 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:21:35 crc kubenswrapper[4711]: I1203 12:21:35.402847 4711 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" Dec 03 12:21:35 crc kubenswrapper[4711]: I1203 12:21:35.403970 4711 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"439acc8b24a33cb172441cb0bf7aad33e2c32df929a12c484a27ad537b2e6d97"} pod="openshift-machine-config-operator/machine-config-daemon-52jgg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 12:21:35 crc kubenswrapper[4711]: I1203 12:21:35.404052 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" containerID="cri-o://439acc8b24a33cb172441cb0bf7aad33e2c32df929a12c484a27ad537b2e6d97" gracePeriod=600 Dec 03 12:21:36 crc kubenswrapper[4711]: I1203 12:21:36.438741 4711 generic.go:334] "Generic (PLEG): container finished" podID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerID="439acc8b24a33cb172441cb0bf7aad33e2c32df929a12c484a27ad537b2e6d97" exitCode=0 Dec 03 12:21:36 crc kubenswrapper[4711]: I1203 12:21:36.438835 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" event={"ID":"776e7d35-d59b-4d4a-97cd-aec4f2441c1e","Type":"ContainerDied","Data":"439acc8b24a33cb172441cb0bf7aad33e2c32df929a12c484a27ad537b2e6d97"} Dec 03 12:21:36 crc kubenswrapper[4711]: I1203 
12:21:36.439320 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" event={"ID":"776e7d35-d59b-4d4a-97cd-aec4f2441c1e","Type":"ContainerStarted","Data":"3e61869a3ac184a35d7e1cec2aece16248296612c02f9fd68dd3a7f8a6d0452f"}
Dec 03 12:21:36 crc kubenswrapper[4711]: I1203 12:21:36.439347 4711 scope.go:117] "RemoveContainer" containerID="d82c5517a78be53885c5d2e8a414e572be5defba4fb0e90131d85a364814f137"
Dec 03 12:21:43 crc kubenswrapper[4711]: I1203 12:21:43.247957 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bfrlg"]
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.281383 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" podUID="4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b" containerName="oauth-openshift" containerID="cri-o://69b8bdba0bc8c1aebec1ee8672cdeb9f982ed8c1db27a73ca92da5e46e6e4618" gracePeriod=15
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.613983 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.632873 4711 generic.go:334] "Generic (PLEG): container finished" podID="4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b" containerID="69b8bdba0bc8c1aebec1ee8672cdeb9f982ed8c1db27a73ca92da5e46e6e4618" exitCode=0
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.632968 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" event={"ID":"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b","Type":"ContainerDied","Data":"69b8bdba0bc8c1aebec1ee8672cdeb9f982ed8c1db27a73ca92da5e46e6e4618"}
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.632975 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.633005 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bfrlg" event={"ID":"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b","Type":"ContainerDied","Data":"baf1f49578cc09772b352d62c0fd8a37130dc9d44303cdb1797a8af8200deb95"}
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.633030 4711 scope.go:117] "RemoveContainer" containerID="69b8bdba0bc8c1aebec1ee8672cdeb9f982ed8c1db27a73ca92da5e46e6e4618"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.655027 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"]
Dec 03 12:22:08 crc kubenswrapper[4711]: E1203 12:22:08.655240 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.655251 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Dec 03 12:22:08 crc kubenswrapper[4711]: E1203 12:22:08.655259 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b" containerName="oauth-openshift"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.655266 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b" containerName="oauth-openshift"
Dec 03 12:22:08 crc kubenswrapper[4711]: E1203 12:22:08.655273 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c22280-fb06-40d1-9599-ce73f2d137d9" containerName="installer"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.655279 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c22280-fb06-40d1-9599-ce73f2d137d9" containerName="installer"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.655467 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.655577 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b" containerName="oauth-openshift"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.655633 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="40c22280-fb06-40d1-9599-ce73f2d137d9" containerName="installer"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.656133 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.666254 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"]
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.680571 4711 scope.go:117] "RemoveContainer" containerID="69b8bdba0bc8c1aebec1ee8672cdeb9f982ed8c1db27a73ca92da5e46e6e4618"
Dec 03 12:22:08 crc kubenswrapper[4711]: E1203 12:22:08.681197 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69b8bdba0bc8c1aebec1ee8672cdeb9f982ed8c1db27a73ca92da5e46e6e4618\": container with ID starting with 69b8bdba0bc8c1aebec1ee8672cdeb9f982ed8c1db27a73ca92da5e46e6e4618 not found: ID does not exist" containerID="69b8bdba0bc8c1aebec1ee8672cdeb9f982ed8c1db27a73ca92da5e46e6e4618"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.681278 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69b8bdba0bc8c1aebec1ee8672cdeb9f982ed8c1db27a73ca92da5e46e6e4618"} err="failed to get container status \"69b8bdba0bc8c1aebec1ee8672cdeb9f982ed8c1db27a73ca92da5e46e6e4618\": rpc error: code = NotFound desc = could not find container \"69b8bdba0bc8c1aebec1ee8672cdeb9f982ed8c1db27a73ca92da5e46e6e4618\": container with ID starting with 69b8bdba0bc8c1aebec1ee8672cdeb9f982ed8c1db27a73ca92da5e46e6e4618 not found: ID does not exist"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.727297 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-router-certs\") pod \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") "
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.727389 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-ocp-branding-template\") pod \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") "
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.727462 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-user-template-provider-selection\") pod \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") "
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.727541 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-service-ca\") pod \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") "
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.727608 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-audit-policies\") pod \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") "
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.727683 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-session\") pod \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") "
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.727709 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-serving-cert\") pod \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") "
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.727748 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-user-template-login\") pod \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") "
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.727774 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-audit-dir\") pod \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") "
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.727822 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq9jv\" (UniqueName: \"kubernetes.io/projected/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-kube-api-access-vq9jv\") pod \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") "
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.727860 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-trusted-ca-bundle\") pod \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") "
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.727897 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-user-template-error\") pod \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") "
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.727957 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-user-idp-0-file-data\") pod \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") "
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.727992 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-cliconfig\") pod \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\" (UID: \"4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b\") "
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.729145 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b" (UID: "4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.730343 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b" (UID: "4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.730343 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b" (UID: "4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.731048 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b" (UID: "4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.731073 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b" (UID: "4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.731215 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.731327 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.731359 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdz7d\" (UniqueName: \"kubernetes.io/projected/9001fac8-0086-4c69-8b2a-770b84b4f767-kube-api-access-tdz7d\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.731439 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-user-template-login\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.731643 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9001fac8-0086-4c69-8b2a-770b84b4f767-audit-dir\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.731749 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.731857 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9001fac8-0086-4c69-8b2a-770b84b4f767-audit-policies\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.731919 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-user-template-error\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.731953 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-system-router-certs\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.732000 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.732134 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-system-service-ca\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.732208 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-system-session\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.732317 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.732402 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.732488 4711 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.732507 4711 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-audit-policies\") on node \"crc\" DevicePath \"\""
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.732519 4711 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.732531 4711 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-audit-dir\") on node \"crc\" DevicePath \"\""
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.732543 4711 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.737535 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b" (UID: "4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.737839 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-kube-api-access-vq9jv" (OuterVolumeSpecName: "kube-api-access-vq9jv") pod "4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b" (UID: "4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b"). InnerVolumeSpecName "kube-api-access-vq9jv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.738045 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b" (UID: "4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.738298 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b" (UID: "4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.740513 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b" (UID: "4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.741205 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b" (UID: "4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.741384 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b" (UID: "4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.741623 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b" (UID: "4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.741898 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b" (UID: "4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.833927 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9001fac8-0086-4c69-8b2a-770b84b4f767-audit-dir\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.834043 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.834036 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9001fac8-0086-4c69-8b2a-770b84b4f767-audit-dir\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.834093 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9001fac8-0086-4c69-8b2a-770b84b4f767-audit-policies\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.834122 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-user-template-error\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.834148 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-system-router-certs\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.834181 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.834221 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-system-service-ca\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.834247 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-system-session\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.834285 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.834317 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.834346 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.834376 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.834435 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdz7d\" (UniqueName: \"kubernetes.io/projected/9001fac8-0086-4c69-8b2a-770b84b4f767-kube-api-access-tdz7d\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.834472 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-user-template-login\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.834529 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq9jv\" (UniqueName: \"kubernetes.io/projected/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-kube-api-access-vq9jv\") on node \"crc\" DevicePath \"\""
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.834546 4711 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.834563 4711 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.834578 4711 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.834594 4711 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.834608 4711 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.834622 4711 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.834636 4711 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.834648 4711 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.835837 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.836897 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-system-service-ca\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.837299 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9001fac8-0086-4c69-8b2a-770b84b4f767-audit-policies\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.838072 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.838185 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.839817 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.840317 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-system-router-certs\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.841005 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-user-template-login\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.841224 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.841347 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-system-session\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.842277 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.842477 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9001fac8-0086-4c69-8b2a-770b84b4f767-v4-0-config-user-template-error\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.853360 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdz7d\" (UniqueName: \"kubernetes.io/projected/9001fac8-0086-4c69-8b2a-770b84b4f767-kube-api-access-tdz7d\") pod \"oauth-openshift-7d6fbddb84-9w85v\" (UID: \"9001fac8-0086-4c69-8b2a-770b84b4f767\") " pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.970858 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bfrlg"]
Dec 03 12:22:08 crc kubenswrapper[4711]: I1203 12:22:08.974415 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bfrlg"]
Dec 03 12:22:08 crc 
kubenswrapper[4711]: I1203 12:22:08.974783 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v" Dec 03 12:22:09 crc kubenswrapper[4711]: I1203 12:22:09.170586 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7d6fbddb84-9w85v"] Dec 03 12:22:09 crc kubenswrapper[4711]: I1203 12:22:09.650099 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v" event={"ID":"9001fac8-0086-4c69-8b2a-770b84b4f767","Type":"ContainerStarted","Data":"db2e37df79bf7a661337012adf2b42e78b29a34e657651a0894355a5e0e5c654"} Dec 03 12:22:09 crc kubenswrapper[4711]: I1203 12:22:09.650161 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v" event={"ID":"9001fac8-0086-4c69-8b2a-770b84b4f767","Type":"ContainerStarted","Data":"d33e9c08df8d88eba39ed07e94cc4621452530bf735174137d51f12d16a4f28f"} Dec 03 12:22:09 crc kubenswrapper[4711]: I1203 12:22:09.827082 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b" path="/var/lib/kubelet/pods/4cfaca9b-aed0-4d3d-a9e9-2e987f2ddb7b/volumes" Dec 03 12:22:10 crc kubenswrapper[4711]: I1203 12:22:10.655764 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v" Dec 03 12:22:10 crc kubenswrapper[4711]: I1203 12:22:10.666263 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v" Dec 03 12:22:10 crc kubenswrapper[4711]: I1203 12:22:10.685205 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7d6fbddb84-9w85v" podStartSLOduration=27.685174763 podStartE2EDuration="27.685174763s" podCreationTimestamp="2025-12-03 12:21:43 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:22:10.683223 +0000 UTC m=+449.352474275" watchObservedRunningTime="2025-12-03 12:22:10.685174763 +0000 UTC m=+449.354426018" Dec 03 12:23:35 crc kubenswrapper[4711]: I1203 12:23:35.401670 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:23:35 crc kubenswrapper[4711]: I1203 12:23:35.402468 4711 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:24:05 crc kubenswrapper[4711]: I1203 12:24:05.401520 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:24:05 crc kubenswrapper[4711]: I1203 12:24:05.402529 4711 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:24:35 crc kubenswrapper[4711]: I1203 12:24:35.402134 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:24:35 crc kubenswrapper[4711]: I1203 12:24:35.402805 4711 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:24:35 crc kubenswrapper[4711]: I1203 12:24:35.402860 4711 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" Dec 03 12:24:35 crc kubenswrapper[4711]: I1203 12:24:35.403485 4711 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3e61869a3ac184a35d7e1cec2aece16248296612c02f9fd68dd3a7f8a6d0452f"} pod="openshift-machine-config-operator/machine-config-daemon-52jgg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 12:24:35 crc kubenswrapper[4711]: I1203 12:24:35.403539 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" containerID="cri-o://3e61869a3ac184a35d7e1cec2aece16248296612c02f9fd68dd3a7f8a6d0452f" gracePeriod=600 Dec 03 12:24:36 crc kubenswrapper[4711]: I1203 12:24:36.511272 4711 generic.go:334] "Generic (PLEG): container finished" podID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerID="3e61869a3ac184a35d7e1cec2aece16248296612c02f9fd68dd3a7f8a6d0452f" exitCode=0 Dec 03 12:24:36 crc kubenswrapper[4711]: I1203 12:24:36.511342 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" 
event={"ID":"776e7d35-d59b-4d4a-97cd-aec4f2441c1e","Type":"ContainerDied","Data":"3e61869a3ac184a35d7e1cec2aece16248296612c02f9fd68dd3a7f8a6d0452f"} Dec 03 12:24:36 crc kubenswrapper[4711]: I1203 12:24:36.512355 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" event={"ID":"776e7d35-d59b-4d4a-97cd-aec4f2441c1e","Type":"ContainerStarted","Data":"353bb382f540c9af7fa95771c36b38b3219d1db4c595540997493e382b5d1428"} Dec 03 12:24:36 crc kubenswrapper[4711]: I1203 12:24:36.512394 4711 scope.go:117] "RemoveContainer" containerID="439acc8b24a33cb172441cb0bf7aad33e2c32df929a12c484a27ad537b2e6d97" Dec 03 12:26:34 crc kubenswrapper[4711]: I1203 12:26:34.882263 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wqd2m"] Dec 03 12:26:34 crc kubenswrapper[4711]: I1203 12:26:34.884353 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-wqd2m" Dec 03 12:26:34 crc kubenswrapper[4711]: I1203 12:26:34.893100 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wqd2m"] Dec 03 12:26:35 crc kubenswrapper[4711]: I1203 12:26:35.052255 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/11906581-9395-40f3-bc91-c3f0b9b1eea4-registry-certificates\") pod \"image-registry-66df7c8f76-wqd2m\" (UID: \"11906581-9395-40f3-bc91-c3f0b9b1eea4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqd2m" Dec 03 12:26:35 crc kubenswrapper[4711]: I1203 12:26:35.052303 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11906581-9395-40f3-bc91-c3f0b9b1eea4-trusted-ca\") pod \"image-registry-66df7c8f76-wqd2m\" (UID: 
\"11906581-9395-40f3-bc91-c3f0b9b1eea4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqd2m" Dec 03 12:26:35 crc kubenswrapper[4711]: I1203 12:26:35.052339 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11906581-9395-40f3-bc91-c3f0b9b1eea4-bound-sa-token\") pod \"image-registry-66df7c8f76-wqd2m\" (UID: \"11906581-9395-40f3-bc91-c3f0b9b1eea4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqd2m" Dec 03 12:26:35 crc kubenswrapper[4711]: I1203 12:26:35.052381 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-wqd2m\" (UID: \"11906581-9395-40f3-bc91-c3f0b9b1eea4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqd2m" Dec 03 12:26:35 crc kubenswrapper[4711]: I1203 12:26:35.052410 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/11906581-9395-40f3-bc91-c3f0b9b1eea4-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wqd2m\" (UID: \"11906581-9395-40f3-bc91-c3f0b9b1eea4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqd2m" Dec 03 12:26:35 crc kubenswrapper[4711]: I1203 12:26:35.052437 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l65gd\" (UniqueName: \"kubernetes.io/projected/11906581-9395-40f3-bc91-c3f0b9b1eea4-kube-api-access-l65gd\") pod \"image-registry-66df7c8f76-wqd2m\" (UID: \"11906581-9395-40f3-bc91-c3f0b9b1eea4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqd2m" Dec 03 12:26:35 crc kubenswrapper[4711]: I1203 12:26:35.052487 4711 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/11906581-9395-40f3-bc91-c3f0b9b1eea4-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wqd2m\" (UID: \"11906581-9395-40f3-bc91-c3f0b9b1eea4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqd2m" Dec 03 12:26:35 crc kubenswrapper[4711]: I1203 12:26:35.052515 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/11906581-9395-40f3-bc91-c3f0b9b1eea4-registry-tls\") pod \"image-registry-66df7c8f76-wqd2m\" (UID: \"11906581-9395-40f3-bc91-c3f0b9b1eea4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqd2m" Dec 03 12:26:35 crc kubenswrapper[4711]: I1203 12:26:35.074699 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-wqd2m\" (UID: \"11906581-9395-40f3-bc91-c3f0b9b1eea4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqd2m" Dec 03 12:26:35 crc kubenswrapper[4711]: I1203 12:26:35.154005 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11906581-9395-40f3-bc91-c3f0b9b1eea4-bound-sa-token\") pod \"image-registry-66df7c8f76-wqd2m\" (UID: \"11906581-9395-40f3-bc91-c3f0b9b1eea4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqd2m" Dec 03 12:26:35 crc kubenswrapper[4711]: I1203 12:26:35.154071 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/11906581-9395-40f3-bc91-c3f0b9b1eea4-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wqd2m\" (UID: \"11906581-9395-40f3-bc91-c3f0b9b1eea4\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-wqd2m" Dec 03 12:26:35 crc kubenswrapper[4711]: I1203 12:26:35.154093 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l65gd\" (UniqueName: \"kubernetes.io/projected/11906581-9395-40f3-bc91-c3f0b9b1eea4-kube-api-access-l65gd\") pod \"image-registry-66df7c8f76-wqd2m\" (UID: \"11906581-9395-40f3-bc91-c3f0b9b1eea4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqd2m" Dec 03 12:26:35 crc kubenswrapper[4711]: I1203 12:26:35.154126 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/11906581-9395-40f3-bc91-c3f0b9b1eea4-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wqd2m\" (UID: \"11906581-9395-40f3-bc91-c3f0b9b1eea4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqd2m" Dec 03 12:26:35 crc kubenswrapper[4711]: I1203 12:26:35.154148 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/11906581-9395-40f3-bc91-c3f0b9b1eea4-registry-tls\") pod \"image-registry-66df7c8f76-wqd2m\" (UID: \"11906581-9395-40f3-bc91-c3f0b9b1eea4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqd2m" Dec 03 12:26:35 crc kubenswrapper[4711]: I1203 12:26:35.154192 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/11906581-9395-40f3-bc91-c3f0b9b1eea4-registry-certificates\") pod \"image-registry-66df7c8f76-wqd2m\" (UID: \"11906581-9395-40f3-bc91-c3f0b9b1eea4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqd2m" Dec 03 12:26:35 crc kubenswrapper[4711]: I1203 12:26:35.154206 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11906581-9395-40f3-bc91-c3f0b9b1eea4-trusted-ca\") pod 
\"image-registry-66df7c8f76-wqd2m\" (UID: \"11906581-9395-40f3-bc91-c3f0b9b1eea4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqd2m" Dec 03 12:26:35 crc kubenswrapper[4711]: I1203 12:26:35.155433 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11906581-9395-40f3-bc91-c3f0b9b1eea4-trusted-ca\") pod \"image-registry-66df7c8f76-wqd2m\" (UID: \"11906581-9395-40f3-bc91-c3f0b9b1eea4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqd2m" Dec 03 12:26:35 crc kubenswrapper[4711]: I1203 12:26:35.155421 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/11906581-9395-40f3-bc91-c3f0b9b1eea4-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wqd2m\" (UID: \"11906581-9395-40f3-bc91-c3f0b9b1eea4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqd2m" Dec 03 12:26:35 crc kubenswrapper[4711]: I1203 12:26:35.156722 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/11906581-9395-40f3-bc91-c3f0b9b1eea4-registry-certificates\") pod \"image-registry-66df7c8f76-wqd2m\" (UID: \"11906581-9395-40f3-bc91-c3f0b9b1eea4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqd2m" Dec 03 12:26:35 crc kubenswrapper[4711]: I1203 12:26:35.162658 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/11906581-9395-40f3-bc91-c3f0b9b1eea4-registry-tls\") pod \"image-registry-66df7c8f76-wqd2m\" (UID: \"11906581-9395-40f3-bc91-c3f0b9b1eea4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqd2m" Dec 03 12:26:35 crc kubenswrapper[4711]: I1203 12:26:35.162771 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/11906581-9395-40f3-bc91-c3f0b9b1eea4-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wqd2m\" (UID: \"11906581-9395-40f3-bc91-c3f0b9b1eea4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqd2m" Dec 03 12:26:35 crc kubenswrapper[4711]: I1203 12:26:35.174654 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11906581-9395-40f3-bc91-c3f0b9b1eea4-bound-sa-token\") pod \"image-registry-66df7c8f76-wqd2m\" (UID: \"11906581-9395-40f3-bc91-c3f0b9b1eea4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqd2m" Dec 03 12:26:35 crc kubenswrapper[4711]: I1203 12:26:35.174849 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l65gd\" (UniqueName: \"kubernetes.io/projected/11906581-9395-40f3-bc91-c3f0b9b1eea4-kube-api-access-l65gd\") pod \"image-registry-66df7c8f76-wqd2m\" (UID: \"11906581-9395-40f3-bc91-c3f0b9b1eea4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqd2m" Dec 03 12:26:35 crc kubenswrapper[4711]: I1203 12:26:35.205847 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-wqd2m" Dec 03 12:26:35 crc kubenswrapper[4711]: I1203 12:26:35.401235 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wqd2m"] Dec 03 12:26:36 crc kubenswrapper[4711]: I1203 12:26:36.242687 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-wqd2m" event={"ID":"11906581-9395-40f3-bc91-c3f0b9b1eea4","Type":"ContainerStarted","Data":"7241a0041a4ad4423d996454d2b5b41b42300c7d994008c25abafea7a3042e7c"} Dec 03 12:26:36 crc kubenswrapper[4711]: I1203 12:26:36.243114 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-wqd2m" Dec 03 12:26:36 crc kubenswrapper[4711]: I1203 12:26:36.243129 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-wqd2m" event={"ID":"11906581-9395-40f3-bc91-c3f0b9b1eea4","Type":"ContainerStarted","Data":"876686689f265b7ffabe647bbcb293e835235033a0ce6e51659bf2379763a73c"} Dec 03 12:26:36 crc kubenswrapper[4711]: I1203 12:26:36.261670 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-wqd2m" podStartSLOduration=2.261644044 podStartE2EDuration="2.261644044s" podCreationTimestamp="2025-12-03 12:26:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:26:36.259109075 +0000 UTC m=+714.928360340" watchObservedRunningTime="2025-12-03 12:26:36.261644044 +0000 UTC m=+714.930895339" Dec 03 12:26:55 crc kubenswrapper[4711]: I1203 12:26:55.211783 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-wqd2m" Dec 03 12:26:55 crc kubenswrapper[4711]: I1203 12:26:55.294252 4711 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sjc5j"] Dec 03 12:27:05 crc kubenswrapper[4711]: I1203 12:27:05.401534 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:27:05 crc kubenswrapper[4711]: I1203 12:27:05.402129 4711 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:27:20 crc kubenswrapper[4711]: I1203 12:27:20.343361 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" podUID="e58cccb4-9efa-494f-98c3-fbf61d444804" containerName="registry" containerID="cri-o://e7931db9c66da5319a22fe22c95d85d77a16d46818f1ca9bf4cb413fdc4b0b0a" gracePeriod=30 Dec 03 12:27:20 crc kubenswrapper[4711]: I1203 12:27:20.494311 4711 generic.go:334] "Generic (PLEG): container finished" podID="e58cccb4-9efa-494f-98c3-fbf61d444804" containerID="e7931db9c66da5319a22fe22c95d85d77a16d46818f1ca9bf4cb413fdc4b0b0a" exitCode=0 Dec 03 12:27:20 crc kubenswrapper[4711]: I1203 12:27:20.494386 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" event={"ID":"e58cccb4-9efa-494f-98c3-fbf61d444804","Type":"ContainerDied","Data":"e7931db9c66da5319a22fe22c95d85d77a16d46818f1ca9bf4cb413fdc4b0b0a"} Dec 03 12:27:20 crc kubenswrapper[4711]: I1203 12:27:20.701935 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:27:20 crc kubenswrapper[4711]: I1203 12:27:20.729816 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"e58cccb4-9efa-494f-98c3-fbf61d444804\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " Dec 03 12:27:20 crc kubenswrapper[4711]: I1203 12:27:20.729938 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e58cccb4-9efa-494f-98c3-fbf61d444804-registry-tls\") pod \"e58cccb4-9efa-494f-98c3-fbf61d444804\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " Dec 03 12:27:20 crc kubenswrapper[4711]: I1203 12:27:20.729994 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e58cccb4-9efa-494f-98c3-fbf61d444804-installation-pull-secrets\") pod \"e58cccb4-9efa-494f-98c3-fbf61d444804\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " Dec 03 12:27:20 crc kubenswrapper[4711]: I1203 12:27:20.730026 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e58cccb4-9efa-494f-98c3-fbf61d444804-registry-certificates\") pod \"e58cccb4-9efa-494f-98c3-fbf61d444804\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " Dec 03 12:27:20 crc kubenswrapper[4711]: I1203 12:27:20.730058 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e58cccb4-9efa-494f-98c3-fbf61d444804-bound-sa-token\") pod \"e58cccb4-9efa-494f-98c3-fbf61d444804\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " Dec 03 12:27:20 crc kubenswrapper[4711]: I1203 12:27:20.730094 4711 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e58cccb4-9efa-494f-98c3-fbf61d444804-ca-trust-extracted\") pod \"e58cccb4-9efa-494f-98c3-fbf61d444804\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " Dec 03 12:27:20 crc kubenswrapper[4711]: I1203 12:27:20.730117 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e58cccb4-9efa-494f-98c3-fbf61d444804-trusted-ca\") pod \"e58cccb4-9efa-494f-98c3-fbf61d444804\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " Dec 03 12:27:20 crc kubenswrapper[4711]: I1203 12:27:20.730149 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp4n7\" (UniqueName: \"kubernetes.io/projected/e58cccb4-9efa-494f-98c3-fbf61d444804-kube-api-access-cp4n7\") pod \"e58cccb4-9efa-494f-98c3-fbf61d444804\" (UID: \"e58cccb4-9efa-494f-98c3-fbf61d444804\") " Dec 03 12:27:20 crc kubenswrapper[4711]: I1203 12:27:20.731299 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e58cccb4-9efa-494f-98c3-fbf61d444804-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e58cccb4-9efa-494f-98c3-fbf61d444804" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:27:20 crc kubenswrapper[4711]: I1203 12:27:20.731729 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e58cccb4-9efa-494f-98c3-fbf61d444804-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e58cccb4-9efa-494f-98c3-fbf61d444804" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:27:20 crc kubenswrapper[4711]: I1203 12:27:20.738498 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e58cccb4-9efa-494f-98c3-fbf61d444804-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e58cccb4-9efa-494f-98c3-fbf61d444804" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:27:20 crc kubenswrapper[4711]: I1203 12:27:20.739282 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "e58cccb4-9efa-494f-98c3-fbf61d444804" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 12:27:20 crc kubenswrapper[4711]: I1203 12:27:20.743272 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e58cccb4-9efa-494f-98c3-fbf61d444804-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e58cccb4-9efa-494f-98c3-fbf61d444804" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:27:20 crc kubenswrapper[4711]: I1203 12:27:20.743649 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e58cccb4-9efa-494f-98c3-fbf61d444804-kube-api-access-cp4n7" (OuterVolumeSpecName: "kube-api-access-cp4n7") pod "e58cccb4-9efa-494f-98c3-fbf61d444804" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804"). InnerVolumeSpecName "kube-api-access-cp4n7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:27:20 crc kubenswrapper[4711]: I1203 12:27:20.744231 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e58cccb4-9efa-494f-98c3-fbf61d444804-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e58cccb4-9efa-494f-98c3-fbf61d444804" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:27:20 crc kubenswrapper[4711]: I1203 12:27:20.751579 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e58cccb4-9efa-494f-98c3-fbf61d444804-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e58cccb4-9efa-494f-98c3-fbf61d444804" (UID: "e58cccb4-9efa-494f-98c3-fbf61d444804"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:27:20 crc kubenswrapper[4711]: I1203 12:27:20.831553 4711 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e58cccb4-9efa-494f-98c3-fbf61d444804-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 03 12:27:20 crc kubenswrapper[4711]: I1203 12:27:20.831831 4711 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e58cccb4-9efa-494f-98c3-fbf61d444804-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:27:20 crc kubenswrapper[4711]: I1203 12:27:20.831900 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cp4n7\" (UniqueName: \"kubernetes.io/projected/e58cccb4-9efa-494f-98c3-fbf61d444804-kube-api-access-cp4n7\") on node \"crc\" DevicePath \"\"" Dec 03 12:27:20 crc kubenswrapper[4711]: I1203 12:27:20.832028 4711 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/e58cccb4-9efa-494f-98c3-fbf61d444804-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 03 12:27:20 crc kubenswrapper[4711]: I1203 12:27:20.832126 4711 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e58cccb4-9efa-494f-98c3-fbf61d444804-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 03 12:27:20 crc kubenswrapper[4711]: I1203 12:27:20.832192 4711 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e58cccb4-9efa-494f-98c3-fbf61d444804-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 03 12:27:20 crc kubenswrapper[4711]: I1203 12:27:20.832253 4711 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e58cccb4-9efa-494f-98c3-fbf61d444804-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 12:27:21 crc kubenswrapper[4711]: I1203 12:27:21.319785 4711 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 03 12:27:21 crc kubenswrapper[4711]: I1203 12:27:21.500798 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" event={"ID":"e58cccb4-9efa-494f-98c3-fbf61d444804","Type":"ContainerDied","Data":"82449a86e4ad3618ca2f5d52c23fc3831346f9a66c3eb2fffa6b2635759c23de"} Dec 03 12:27:21 crc kubenswrapper[4711]: I1203 12:27:21.500845 4711 scope.go:117] "RemoveContainer" containerID="e7931db9c66da5319a22fe22c95d85d77a16d46818f1ca9bf4cb413fdc4b0b0a" Dec 03 12:27:21 crc kubenswrapper[4711]: I1203 12:27:21.500882 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-sjc5j" Dec 03 12:27:21 crc kubenswrapper[4711]: I1203 12:27:21.532844 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sjc5j"] Dec 03 12:27:21 crc kubenswrapper[4711]: I1203 12:27:21.543575 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sjc5j"] Dec 03 12:27:21 crc kubenswrapper[4711]: I1203 12:27:21.823741 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e58cccb4-9efa-494f-98c3-fbf61d444804" path="/var/lib/kubelet/pods/e58cccb4-9efa-494f-98c3-fbf61d444804/volumes" Dec 03 12:27:35 crc kubenswrapper[4711]: I1203 12:27:35.401170 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:27:35 crc kubenswrapper[4711]: I1203 12:27:35.401691 4711 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:28:05 crc kubenswrapper[4711]: I1203 12:28:05.401832 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:28:05 crc kubenswrapper[4711]: I1203 12:28:05.402664 4711 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" 
podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:28:05 crc kubenswrapper[4711]: I1203 12:28:05.402737 4711 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" Dec 03 12:28:05 crc kubenswrapper[4711]: I1203 12:28:05.403595 4711 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"353bb382f540c9af7fa95771c36b38b3219d1db4c595540997493e382b5d1428"} pod="openshift-machine-config-operator/machine-config-daemon-52jgg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 12:28:05 crc kubenswrapper[4711]: I1203 12:28:05.403659 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" containerID="cri-o://353bb382f540c9af7fa95771c36b38b3219d1db4c595540997493e382b5d1428" gracePeriod=600 Dec 03 12:28:05 crc kubenswrapper[4711]: I1203 12:28:05.814863 4711 generic.go:334] "Generic (PLEG): container finished" podID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerID="353bb382f540c9af7fa95771c36b38b3219d1db4c595540997493e382b5d1428" exitCode=0 Dec 03 12:28:05 crc kubenswrapper[4711]: I1203 12:28:05.814964 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" event={"ID":"776e7d35-d59b-4d4a-97cd-aec4f2441c1e","Type":"ContainerDied","Data":"353bb382f540c9af7fa95771c36b38b3219d1db4c595540997493e382b5d1428"} Dec 03 12:28:05 crc kubenswrapper[4711]: I1203 12:28:05.815396 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-52jgg" event={"ID":"776e7d35-d59b-4d4a-97cd-aec4f2441c1e","Type":"ContainerStarted","Data":"ba490530ff3c59515e7bc84e7f01736c013e66e1dc8f01fbdfa3312d40c3ab5f"} Dec 03 12:28:05 crc kubenswrapper[4711]: I1203 12:28:05.815431 4711 scope.go:117] "RemoveContainer" containerID="3e61869a3ac184a35d7e1cec2aece16248296612c02f9fd68dd3a7f8a6d0452f" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.189731 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7zrh5"] Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.191035 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7zrh5" podUID="fff3e844-ad61-4914-aaff-fdd90e3a9a58" containerName="registry-server" containerID="cri-o://7701253679bdf34d74e546b732074f80007db322b987baca1c7dcf023646edb3" gracePeriod=30 Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.193295 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5wz5c"] Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.193524 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5wz5c" podUID="61b13e0f-7d37-469f-bf80-3c4b455c34b0" containerName="registry-server" containerID="cri-o://06ea9839e0370a538de1486cc8d4332a6fdedcd721c817a475f171680ac82f54" gracePeriod=30 Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.211714 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-c8g7k"] Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.211962 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-c8g7k" podUID="9bc96e8b-e5df-40a7-8690-530895999a16" containerName="marketplace-operator" 
containerID="cri-o://25ee83eb1ecb6b22ee0daf5771cd2e5853ee9c9e015b6379e15b9a24e3f1adaf" gracePeriod=30 Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.218569 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rjr4q"] Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.218841 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rjr4q" podUID="0908002c-11e7-4804-a059-dbd19dc6d739" containerName="registry-server" containerID="cri-o://54edfc62586ace4fc50876a6e4e39930a8a706e3d97fa66b3e5c35714404572e" gracePeriod=30 Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.223137 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lx4g4"] Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.223449 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lx4g4" podUID="559a1102-7df3-414e-9a4c-37cd0d35daf1" containerName="registry-server" containerID="cri-o://956617d0bd7873e1a4f11bd8c6dbefbcf1969a2e17ea1a362dcb09e054e68f27" gracePeriod=30 Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.246944 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sr2lt"] Dec 03 12:28:39 crc kubenswrapper[4711]: E1203 12:28:39.247204 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e58cccb4-9efa-494f-98c3-fbf61d444804" containerName="registry" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.247223 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="e58cccb4-9efa-494f-98c3-fbf61d444804" containerName="registry" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.247355 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="e58cccb4-9efa-494f-98c3-fbf61d444804" containerName="registry" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 
12:28:39.247749 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-n6q2z"] Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.247837 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-sr2lt" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.248690 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n6q2z" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.255493 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sr2lt"] Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.262366 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n6q2z"] Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.393836 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9638f6ab-2a76-4c44-9100-237e91ad4f6f-utilities\") pod \"certified-operators-n6q2z\" (UID: \"9638f6ab-2a76-4c44-9100-237e91ad4f6f\") " pod="openshift-marketplace/certified-operators-n6q2z" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.393934 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3a0a6697-5cb1-4381-a050-ce15daaeb231-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-sr2lt\" (UID: \"3a0a6697-5cb1-4381-a050-ce15daaeb231\") " pod="openshift-marketplace/marketplace-operator-79b997595-sr2lt" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.393974 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsjjd\" (UniqueName: 
\"kubernetes.io/projected/3a0a6697-5cb1-4381-a050-ce15daaeb231-kube-api-access-fsjjd\") pod \"marketplace-operator-79b997595-sr2lt\" (UID: \"3a0a6697-5cb1-4381-a050-ce15daaeb231\") " pod="openshift-marketplace/marketplace-operator-79b997595-sr2lt" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.394002 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3a0a6697-5cb1-4381-a050-ce15daaeb231-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-sr2lt\" (UID: \"3a0a6697-5cb1-4381-a050-ce15daaeb231\") " pod="openshift-marketplace/marketplace-operator-79b997595-sr2lt" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.394055 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4g44\" (UniqueName: \"kubernetes.io/projected/9638f6ab-2a76-4c44-9100-237e91ad4f6f-kube-api-access-h4g44\") pod \"certified-operators-n6q2z\" (UID: \"9638f6ab-2a76-4c44-9100-237e91ad4f6f\") " pod="openshift-marketplace/certified-operators-n6q2z" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.394108 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9638f6ab-2a76-4c44-9100-237e91ad4f6f-catalog-content\") pod \"certified-operators-n6q2z\" (UID: \"9638f6ab-2a76-4c44-9100-237e91ad4f6f\") " pod="openshift-marketplace/certified-operators-n6q2z" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.495669 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4g44\" (UniqueName: \"kubernetes.io/projected/9638f6ab-2a76-4c44-9100-237e91ad4f6f-kube-api-access-h4g44\") pod \"certified-operators-n6q2z\" (UID: \"9638f6ab-2a76-4c44-9100-237e91ad4f6f\") " pod="openshift-marketplace/certified-operators-n6q2z" Dec 03 12:28:39 crc 
kubenswrapper[4711]: I1203 12:28:39.495794 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9638f6ab-2a76-4c44-9100-237e91ad4f6f-catalog-content\") pod \"certified-operators-n6q2z\" (UID: \"9638f6ab-2a76-4c44-9100-237e91ad4f6f\") " pod="openshift-marketplace/certified-operators-n6q2z" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.495836 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9638f6ab-2a76-4c44-9100-237e91ad4f6f-utilities\") pod \"certified-operators-n6q2z\" (UID: \"9638f6ab-2a76-4c44-9100-237e91ad4f6f\") " pod="openshift-marketplace/certified-operators-n6q2z" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.495885 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3a0a6697-5cb1-4381-a050-ce15daaeb231-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-sr2lt\" (UID: \"3a0a6697-5cb1-4381-a050-ce15daaeb231\") " pod="openshift-marketplace/marketplace-operator-79b997595-sr2lt" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.495930 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsjjd\" (UniqueName: \"kubernetes.io/projected/3a0a6697-5cb1-4381-a050-ce15daaeb231-kube-api-access-fsjjd\") pod \"marketplace-operator-79b997595-sr2lt\" (UID: \"3a0a6697-5cb1-4381-a050-ce15daaeb231\") " pod="openshift-marketplace/marketplace-operator-79b997595-sr2lt" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.495958 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3a0a6697-5cb1-4381-a050-ce15daaeb231-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-sr2lt\" (UID: 
\"3a0a6697-5cb1-4381-a050-ce15daaeb231\") " pod="openshift-marketplace/marketplace-operator-79b997595-sr2lt" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.498832 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9638f6ab-2a76-4c44-9100-237e91ad4f6f-utilities\") pod \"certified-operators-n6q2z\" (UID: \"9638f6ab-2a76-4c44-9100-237e91ad4f6f\") " pod="openshift-marketplace/certified-operators-n6q2z" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.499221 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9638f6ab-2a76-4c44-9100-237e91ad4f6f-catalog-content\") pod \"certified-operators-n6q2z\" (UID: \"9638f6ab-2a76-4c44-9100-237e91ad4f6f\") " pod="openshift-marketplace/certified-operators-n6q2z" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.502390 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3a0a6697-5cb1-4381-a050-ce15daaeb231-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-sr2lt\" (UID: \"3a0a6697-5cb1-4381-a050-ce15daaeb231\") " pod="openshift-marketplace/marketplace-operator-79b997595-sr2lt" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.509863 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3a0a6697-5cb1-4381-a050-ce15daaeb231-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-sr2lt\" (UID: \"3a0a6697-5cb1-4381-a050-ce15daaeb231\") " pod="openshift-marketplace/marketplace-operator-79b997595-sr2lt" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.517121 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4g44\" (UniqueName: \"kubernetes.io/projected/9638f6ab-2a76-4c44-9100-237e91ad4f6f-kube-api-access-h4g44\") 
pod \"certified-operators-n6q2z\" (UID: \"9638f6ab-2a76-4c44-9100-237e91ad4f6f\") " pod="openshift-marketplace/certified-operators-n6q2z" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.521461 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsjjd\" (UniqueName: \"kubernetes.io/projected/3a0a6697-5cb1-4381-a050-ce15daaeb231-kube-api-access-fsjjd\") pod \"marketplace-operator-79b997595-sr2lt\" (UID: \"3a0a6697-5cb1-4381-a050-ce15daaeb231\") " pod="openshift-marketplace/marketplace-operator-79b997595-sr2lt" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.604894 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-sr2lt" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.617416 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n6q2z" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.622891 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5wz5c" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.656803 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rjr4q" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.663103 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zrh5" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.686526 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-c8g7k" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.689220 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lx4g4" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.700101 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhxh8\" (UniqueName: \"kubernetes.io/projected/9bc96e8b-e5df-40a7-8690-530895999a16-kube-api-access-jhxh8\") pod \"9bc96e8b-e5df-40a7-8690-530895999a16\" (UID: \"9bc96e8b-e5df-40a7-8690-530895999a16\") " Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.700172 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0908002c-11e7-4804-a059-dbd19dc6d739-catalog-content\") pod \"0908002c-11e7-4804-a059-dbd19dc6d739\" (UID: \"0908002c-11e7-4804-a059-dbd19dc6d739\") " Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.700221 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9bc96e8b-e5df-40a7-8690-530895999a16-marketplace-operator-metrics\") pod \"9bc96e8b-e5df-40a7-8690-530895999a16\" (UID: \"9bc96e8b-e5df-40a7-8690-530895999a16\") " Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.700575 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rb65q\" (UniqueName: \"kubernetes.io/projected/61b13e0f-7d37-469f-bf80-3c4b455c34b0-kube-api-access-rb65q\") pod \"61b13e0f-7d37-469f-bf80-3c4b455c34b0\" (UID: \"61b13e0f-7d37-469f-bf80-3c4b455c34b0\") " Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.700615 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/559a1102-7df3-414e-9a4c-37cd0d35daf1-utilities\") pod \"559a1102-7df3-414e-9a4c-37cd0d35daf1\" (UID: \"559a1102-7df3-414e-9a4c-37cd0d35daf1\") " Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.700655 4711 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/559a1102-7df3-414e-9a4c-37cd0d35daf1-catalog-content\") pod \"559a1102-7df3-414e-9a4c-37cd0d35daf1\" (UID: \"559a1102-7df3-414e-9a4c-37cd0d35daf1\") " Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.700687 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fff3e844-ad61-4914-aaff-fdd90e3a9a58-catalog-content\") pod \"fff3e844-ad61-4914-aaff-fdd90e3a9a58\" (UID: \"fff3e844-ad61-4914-aaff-fdd90e3a9a58\") " Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.700720 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61b13e0f-7d37-469f-bf80-3c4b455c34b0-utilities\") pod \"61b13e0f-7d37-469f-bf80-3c4b455c34b0\" (UID: \"61b13e0f-7d37-469f-bf80-3c4b455c34b0\") " Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.700777 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4mgx\" (UniqueName: \"kubernetes.io/projected/0908002c-11e7-4804-a059-dbd19dc6d739-kube-api-access-v4mgx\") pod \"0908002c-11e7-4804-a059-dbd19dc6d739\" (UID: \"0908002c-11e7-4804-a059-dbd19dc6d739\") " Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.700799 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt9cs\" (UniqueName: \"kubernetes.io/projected/559a1102-7df3-414e-9a4c-37cd0d35daf1-kube-api-access-bt9cs\") pod \"559a1102-7df3-414e-9a4c-37cd0d35daf1\" (UID: \"559a1102-7df3-414e-9a4c-37cd0d35daf1\") " Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.700827 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9bc96e8b-e5df-40a7-8690-530895999a16-marketplace-trusted-ca\") pod 
\"9bc96e8b-e5df-40a7-8690-530895999a16\" (UID: \"9bc96e8b-e5df-40a7-8690-530895999a16\") " Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.700851 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0908002c-11e7-4804-a059-dbd19dc6d739-utilities\") pod \"0908002c-11e7-4804-a059-dbd19dc6d739\" (UID: \"0908002c-11e7-4804-a059-dbd19dc6d739\") " Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.700876 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxqdb\" (UniqueName: \"kubernetes.io/projected/fff3e844-ad61-4914-aaff-fdd90e3a9a58-kube-api-access-sxqdb\") pod \"fff3e844-ad61-4914-aaff-fdd90e3a9a58\" (UID: \"fff3e844-ad61-4914-aaff-fdd90e3a9a58\") " Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.700923 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fff3e844-ad61-4914-aaff-fdd90e3a9a58-utilities\") pod \"fff3e844-ad61-4914-aaff-fdd90e3a9a58\" (UID: \"fff3e844-ad61-4914-aaff-fdd90e3a9a58\") " Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.700963 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61b13e0f-7d37-469f-bf80-3c4b455c34b0-catalog-content\") pod \"61b13e0f-7d37-469f-bf80-3c4b455c34b0\" (UID: \"61b13e0f-7d37-469f-bf80-3c4b455c34b0\") " Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.703164 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fff3e844-ad61-4914-aaff-fdd90e3a9a58-utilities" (OuterVolumeSpecName: "utilities") pod "fff3e844-ad61-4914-aaff-fdd90e3a9a58" (UID: "fff3e844-ad61-4914-aaff-fdd90e3a9a58"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.712164 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/559a1102-7df3-414e-9a4c-37cd0d35daf1-utilities" (OuterVolumeSpecName: "utilities") pod "559a1102-7df3-414e-9a4c-37cd0d35daf1" (UID: "559a1102-7df3-414e-9a4c-37cd0d35daf1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.712532 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61b13e0f-7d37-469f-bf80-3c4b455c34b0-utilities" (OuterVolumeSpecName: "utilities") pod "61b13e0f-7d37-469f-bf80-3c4b455c34b0" (UID: "61b13e0f-7d37-469f-bf80-3c4b455c34b0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.713897 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0908002c-11e7-4804-a059-dbd19dc6d739-utilities" (OuterVolumeSpecName: "utilities") pod "0908002c-11e7-4804-a059-dbd19dc6d739" (UID: "0908002c-11e7-4804-a059-dbd19dc6d739"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.714435 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/559a1102-7df3-414e-9a4c-37cd0d35daf1-kube-api-access-bt9cs" (OuterVolumeSpecName: "kube-api-access-bt9cs") pod "559a1102-7df3-414e-9a4c-37cd0d35daf1" (UID: "559a1102-7df3-414e-9a4c-37cd0d35daf1"). InnerVolumeSpecName "kube-api-access-bt9cs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.714522 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bc96e8b-e5df-40a7-8690-530895999a16-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "9bc96e8b-e5df-40a7-8690-530895999a16" (UID: "9bc96e8b-e5df-40a7-8690-530895999a16"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.717265 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bc96e8b-e5df-40a7-8690-530895999a16-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "9bc96e8b-e5df-40a7-8690-530895999a16" (UID: "9bc96e8b-e5df-40a7-8690-530895999a16"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.726018 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bc96e8b-e5df-40a7-8690-530895999a16-kube-api-access-jhxh8" (OuterVolumeSpecName: "kube-api-access-jhxh8") pod "9bc96e8b-e5df-40a7-8690-530895999a16" (UID: "9bc96e8b-e5df-40a7-8690-530895999a16"). InnerVolumeSpecName "kube-api-access-jhxh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.726151 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fff3e844-ad61-4914-aaff-fdd90e3a9a58-kube-api-access-sxqdb" (OuterVolumeSpecName: "kube-api-access-sxqdb") pod "fff3e844-ad61-4914-aaff-fdd90e3a9a58" (UID: "fff3e844-ad61-4914-aaff-fdd90e3a9a58"). InnerVolumeSpecName "kube-api-access-sxqdb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.726186 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0908002c-11e7-4804-a059-dbd19dc6d739-kube-api-access-v4mgx" (OuterVolumeSpecName: "kube-api-access-v4mgx") pod "0908002c-11e7-4804-a059-dbd19dc6d739" (UID: "0908002c-11e7-4804-a059-dbd19dc6d739"). InnerVolumeSpecName "kube-api-access-v4mgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.737204 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61b13e0f-7d37-469f-bf80-3c4b455c34b0-kube-api-access-rb65q" (OuterVolumeSpecName: "kube-api-access-rb65q") pod "61b13e0f-7d37-469f-bf80-3c4b455c34b0" (UID: "61b13e0f-7d37-469f-bf80-3c4b455c34b0"). InnerVolumeSpecName "kube-api-access-rb65q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.790478 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0908002c-11e7-4804-a059-dbd19dc6d739-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0908002c-11e7-4804-a059-dbd19dc6d739" (UID: "0908002c-11e7-4804-a059-dbd19dc6d739"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.803177 4711 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fff3e844-ad61-4914-aaff-fdd90e3a9a58-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.803246 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhxh8\" (UniqueName: \"kubernetes.io/projected/9bc96e8b-e5df-40a7-8690-530895999a16-kube-api-access-jhxh8\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.803271 4711 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0908002c-11e7-4804-a059-dbd19dc6d739-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.803283 4711 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9bc96e8b-e5df-40a7-8690-530895999a16-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.803294 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rb65q\" (UniqueName: \"kubernetes.io/projected/61b13e0f-7d37-469f-bf80-3c4b455c34b0-kube-api-access-rb65q\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.803305 4711 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/559a1102-7df3-414e-9a4c-37cd0d35daf1-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.803314 4711 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61b13e0f-7d37-469f-bf80-3c4b455c34b0-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:39 crc kubenswrapper[4711]: 
I1203 12:28:39.803326 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4mgx\" (UniqueName: \"kubernetes.io/projected/0908002c-11e7-4804-a059-dbd19dc6d739-kube-api-access-v4mgx\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.803338 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt9cs\" (UniqueName: \"kubernetes.io/projected/559a1102-7df3-414e-9a4c-37cd0d35daf1-kube-api-access-bt9cs\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.803351 4711 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9bc96e8b-e5df-40a7-8690-530895999a16-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.803362 4711 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0908002c-11e7-4804-a059-dbd19dc6d739-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.803395 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxqdb\" (UniqueName: \"kubernetes.io/projected/fff3e844-ad61-4914-aaff-fdd90e3a9a58-kube-api-access-sxqdb\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.808663 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61b13e0f-7d37-469f-bf80-3c4b455c34b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "61b13e0f-7d37-469f-bf80-3c4b455c34b0" (UID: "61b13e0f-7d37-469f-bf80-3c4b455c34b0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.825210 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fff3e844-ad61-4914-aaff-fdd90e3a9a58-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fff3e844-ad61-4914-aaff-fdd90e3a9a58" (UID: "fff3e844-ad61-4914-aaff-fdd90e3a9a58"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.830059 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bm8qz"] Dec 03 12:28:39 crc kubenswrapper[4711]: E1203 12:28:39.830260 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="559a1102-7df3-414e-9a4c-37cd0d35daf1" containerName="registry-server" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.830275 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="559a1102-7df3-414e-9a4c-37cd0d35daf1" containerName="registry-server" Dec 03 12:28:39 crc kubenswrapper[4711]: E1203 12:28:39.830286 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fff3e844-ad61-4914-aaff-fdd90e3a9a58" containerName="extract-content" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.830293 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="fff3e844-ad61-4914-aaff-fdd90e3a9a58" containerName="extract-content" Dec 03 12:28:39 crc kubenswrapper[4711]: E1203 12:28:39.830305 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fff3e844-ad61-4914-aaff-fdd90e3a9a58" containerName="extract-utilities" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.830311 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="fff3e844-ad61-4914-aaff-fdd90e3a9a58" containerName="extract-utilities" Dec 03 12:28:39 crc kubenswrapper[4711]: E1203 12:28:39.830323 4711 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="559a1102-7df3-414e-9a4c-37cd0d35daf1" containerName="extract-utilities" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.830329 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="559a1102-7df3-414e-9a4c-37cd0d35daf1" containerName="extract-utilities" Dec 03 12:28:39 crc kubenswrapper[4711]: E1203 12:28:39.830336 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bc96e8b-e5df-40a7-8690-530895999a16" containerName="marketplace-operator" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.830342 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bc96e8b-e5df-40a7-8690-530895999a16" containerName="marketplace-operator" Dec 03 12:28:39 crc kubenswrapper[4711]: E1203 12:28:39.830349 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61b13e0f-7d37-469f-bf80-3c4b455c34b0" containerName="extract-content" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.830354 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="61b13e0f-7d37-469f-bf80-3c4b455c34b0" containerName="extract-content" Dec 03 12:28:39 crc kubenswrapper[4711]: E1203 12:28:39.830362 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61b13e0f-7d37-469f-bf80-3c4b455c34b0" containerName="extract-utilities" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.830369 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="61b13e0f-7d37-469f-bf80-3c4b455c34b0" containerName="extract-utilities" Dec 03 12:28:39 crc kubenswrapper[4711]: E1203 12:28:39.830378 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61b13e0f-7d37-469f-bf80-3c4b455c34b0" containerName="registry-server" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.830384 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="61b13e0f-7d37-469f-bf80-3c4b455c34b0" containerName="registry-server" Dec 03 12:28:39 crc kubenswrapper[4711]: E1203 12:28:39.830393 4711 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="0908002c-11e7-4804-a059-dbd19dc6d739" containerName="registry-server" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.830398 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="0908002c-11e7-4804-a059-dbd19dc6d739" containerName="registry-server" Dec 03 12:28:39 crc kubenswrapper[4711]: E1203 12:28:39.830407 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0908002c-11e7-4804-a059-dbd19dc6d739" containerName="extract-content" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.830413 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="0908002c-11e7-4804-a059-dbd19dc6d739" containerName="extract-content" Dec 03 12:28:39 crc kubenswrapper[4711]: E1203 12:28:39.830421 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bc96e8b-e5df-40a7-8690-530895999a16" containerName="marketplace-operator" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.830427 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bc96e8b-e5df-40a7-8690-530895999a16" containerName="marketplace-operator" Dec 03 12:28:39 crc kubenswrapper[4711]: E1203 12:28:39.830433 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fff3e844-ad61-4914-aaff-fdd90e3a9a58" containerName="registry-server" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.830438 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="fff3e844-ad61-4914-aaff-fdd90e3a9a58" containerName="registry-server" Dec 03 12:28:39 crc kubenswrapper[4711]: E1203 12:28:39.830445 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="559a1102-7df3-414e-9a4c-37cd0d35daf1" containerName="extract-content" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.830451 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="559a1102-7df3-414e-9a4c-37cd0d35daf1" containerName="extract-content" Dec 03 12:28:39 crc kubenswrapper[4711]: E1203 12:28:39.830460 4711 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="0908002c-11e7-4804-a059-dbd19dc6d739" containerName="extract-utilities" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.830466 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="0908002c-11e7-4804-a059-dbd19dc6d739" containerName="extract-utilities" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.830546 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bc96e8b-e5df-40a7-8690-530895999a16" containerName="marketplace-operator" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.830557 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="0908002c-11e7-4804-a059-dbd19dc6d739" containerName="registry-server" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.830569 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="61b13e0f-7d37-469f-bf80-3c4b455c34b0" containerName="registry-server" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.830578 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="fff3e844-ad61-4914-aaff-fdd90e3a9a58" containerName="registry-server" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.830587 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="559a1102-7df3-414e-9a4c-37cd0d35daf1" containerName="registry-server" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.830758 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bc96e8b-e5df-40a7-8690-530895999a16" containerName="marketplace-operator" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.831246 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bm8qz"] Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.831324 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bm8qz" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.858429 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sr2lt"] Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.898591 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/559a1102-7df3-414e-9a4c-37cd0d35daf1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "559a1102-7df3-414e-9a4c-37cd0d35daf1" (UID: "559a1102-7df3-414e-9a4c-37cd0d35daf1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.905227 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff-utilities\") pod \"community-operators-bm8qz\" (UID: \"bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff\") " pod="openshift-marketplace/community-operators-bm8qz" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.905340 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff-catalog-content\") pod \"community-operators-bm8qz\" (UID: \"bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff\") " pod="openshift-marketplace/community-operators-bm8qz" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.905395 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnmvs\" (UniqueName: \"kubernetes.io/projected/bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff-kube-api-access-dnmvs\") pod \"community-operators-bm8qz\" (UID: \"bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff\") " pod="openshift-marketplace/community-operators-bm8qz" Dec 03 12:28:39 crc kubenswrapper[4711]: 
I1203 12:28:39.905474 4711 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61b13e0f-7d37-469f-bf80-3c4b455c34b0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.905492 4711 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/559a1102-7df3-414e-9a4c-37cd0d35daf1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:39 crc kubenswrapper[4711]: I1203 12:28:39.905505 4711 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fff3e844-ad61-4914-aaff-fdd90e3a9a58-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.006130 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnmvs\" (UniqueName: \"kubernetes.io/projected/bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff-kube-api-access-dnmvs\") pod \"community-operators-bm8qz\" (UID: \"bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff\") " pod="openshift-marketplace/community-operators-bm8qz" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.006200 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff-utilities\") pod \"community-operators-bm8qz\" (UID: \"bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff\") " pod="openshift-marketplace/community-operators-bm8qz" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.006242 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff-catalog-content\") pod \"community-operators-bm8qz\" (UID: \"bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff\") " pod="openshift-marketplace/community-operators-bm8qz" Dec 03 12:28:40 crc 
kubenswrapper[4711]: I1203 12:28:40.006857 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff-catalog-content\") pod \"community-operators-bm8qz\" (UID: \"bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff\") " pod="openshift-marketplace/community-operators-bm8qz" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.007140 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff-utilities\") pod \"community-operators-bm8qz\" (UID: \"bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff\") " pod="openshift-marketplace/community-operators-bm8qz" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.015558 4711 generic.go:334] "Generic (PLEG): container finished" podID="fff3e844-ad61-4914-aaff-fdd90e3a9a58" containerID="7701253679bdf34d74e546b732074f80007db322b987baca1c7dcf023646edb3" exitCode=0 Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.015644 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zrh5" event={"ID":"fff3e844-ad61-4914-aaff-fdd90e3a9a58","Type":"ContainerDied","Data":"7701253679bdf34d74e546b732074f80007db322b987baca1c7dcf023646edb3"} Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.015700 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zrh5" event={"ID":"fff3e844-ad61-4914-aaff-fdd90e3a9a58","Type":"ContainerDied","Data":"073afe88fe57646267b73da04a8f4260fb1ae70c4e75248a2bbb3ca6c2139e7b"} Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.015727 4711 scope.go:117] "RemoveContainer" containerID="7701253679bdf34d74e546b732074f80007db322b987baca1c7dcf023646edb3" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.015952 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7zrh5" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.017231 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sr2lt" event={"ID":"3a0a6697-5cb1-4381-a050-ce15daaeb231","Type":"ContainerStarted","Data":"5e53b3d5942934ac06d5064e0a0b2f15c9f85352d3301352d61dd2eb5b4e61dd"} Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.017265 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sr2lt" event={"ID":"3a0a6697-5cb1-4381-a050-ce15daaeb231","Type":"ContainerStarted","Data":"9e112a4c4aaf6c4a86f938ceeb1251fa2f4311c31a0c7f4f4550cc530b4d9e5c"} Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.017796 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-sr2lt" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.019752 4711 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-sr2lt container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.58:8080/healthz\": dial tcp 10.217.0.58:8080: connect: connection refused" start-of-body= Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.019803 4711 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-sr2lt" podUID="3a0a6697-5cb1-4381-a050-ce15daaeb231" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.58:8080/healthz\": dial tcp 10.217.0.58:8080: connect: connection refused" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.020749 4711 generic.go:334] "Generic (PLEG): container finished" podID="559a1102-7df3-414e-9a4c-37cd0d35daf1" containerID="956617d0bd7873e1a4f11bd8c6dbefbcf1969a2e17ea1a362dcb09e054e68f27" exitCode=0 Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 
12:28:40.020829 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lx4g4" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.020852 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lx4g4" event={"ID":"559a1102-7df3-414e-9a4c-37cd0d35daf1","Type":"ContainerDied","Data":"956617d0bd7873e1a4f11bd8c6dbefbcf1969a2e17ea1a362dcb09e054e68f27"} Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.020886 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lx4g4" event={"ID":"559a1102-7df3-414e-9a4c-37cd0d35daf1","Type":"ContainerDied","Data":"ca60723c484f77fcfbbf06f17c6ac8898f63c0157a2379ccd0a016d1e2a89700"} Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.025780 4711 generic.go:334] "Generic (PLEG): container finished" podID="9bc96e8b-e5df-40a7-8690-530895999a16" containerID="25ee83eb1ecb6b22ee0daf5771cd2e5853ee9c9e015b6379e15b9a24e3f1adaf" exitCode=0 Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.025969 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-c8g7k" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.026459 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-c8g7k" event={"ID":"9bc96e8b-e5df-40a7-8690-530895999a16","Type":"ContainerDied","Data":"25ee83eb1ecb6b22ee0daf5771cd2e5853ee9c9e015b6379e15b9a24e3f1adaf"} Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.026510 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-c8g7k" event={"ID":"9bc96e8b-e5df-40a7-8690-530895999a16","Type":"ContainerDied","Data":"be1e6648597ceae7a9cac58db6281d0199ed6f6badadab03e47efd8dd3d7ba2c"} Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.030117 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnmvs\" (UniqueName: \"kubernetes.io/projected/bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff-kube-api-access-dnmvs\") pod \"community-operators-bm8qz\" (UID: \"bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff\") " pod="openshift-marketplace/community-operators-bm8qz" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.035783 4711 generic.go:334] "Generic (PLEG): container finished" podID="0908002c-11e7-4804-a059-dbd19dc6d739" containerID="54edfc62586ace4fc50876a6e4e39930a8a706e3d97fa66b3e5c35714404572e" exitCode=0 Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.035976 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rjr4q" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.036132 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjr4q" event={"ID":"0908002c-11e7-4804-a059-dbd19dc6d739","Type":"ContainerDied","Data":"54edfc62586ace4fc50876a6e4e39930a8a706e3d97fa66b3e5c35714404572e"} Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.036203 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjr4q" event={"ID":"0908002c-11e7-4804-a059-dbd19dc6d739","Type":"ContainerDied","Data":"29027ae30e0d1f37b8a81cbacef24ec9e9bdb46401e721a65c2241ed3b8b3a17"} Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.040819 4711 generic.go:334] "Generic (PLEG): container finished" podID="61b13e0f-7d37-469f-bf80-3c4b455c34b0" containerID="06ea9839e0370a538de1486cc8d4332a6fdedcd721c817a475f171680ac82f54" exitCode=0 Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.040939 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5wz5c" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.040960 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wz5c" event={"ID":"61b13e0f-7d37-469f-bf80-3c4b455c34b0","Type":"ContainerDied","Data":"06ea9839e0370a538de1486cc8d4332a6fdedcd721c817a475f171680ac82f54"} Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.041364 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wz5c" event={"ID":"61b13e0f-7d37-469f-bf80-3c4b455c34b0","Type":"ContainerDied","Data":"f78418c826d5cfb145a831a282ba6c81568ab0b3dee8da93303158c7c239db90"} Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.043774 4711 scope.go:117] "RemoveContainer" containerID="f6f3813e3a544d8b271331d7ee9a8a57a48139ab389dd92614c08f3de8bc958d" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.050603 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7zrh5"] Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.054709 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7zrh5"] Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.056066 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-sr2lt" podStartSLOduration=1.056050755 podStartE2EDuration="1.056050755s" podCreationTimestamp="2025-12-03 12:28:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:28:40.051309405 +0000 UTC m=+838.720560670" watchObservedRunningTime="2025-12-03 12:28:40.056050755 +0000 UTC m=+838.725302010" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.063459 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-5wz5c"] Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.070784 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5wz5c"] Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.086197 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lx4g4"] Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.086647 4711 scope.go:117] "RemoveContainer" containerID="7aadbc818efad99609b5ec0b7a2a59373c831d7b7e8dada3e382eb34e4157eda" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.090709 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lx4g4"] Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.097442 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n6q2z"] Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.102835 4711 scope.go:117] "RemoveContainer" containerID="7701253679bdf34d74e546b732074f80007db322b987baca1c7dcf023646edb3" Dec 03 12:28:40 crc kubenswrapper[4711]: E1203 12:28:40.103351 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7701253679bdf34d74e546b732074f80007db322b987baca1c7dcf023646edb3\": container with ID starting with 7701253679bdf34d74e546b732074f80007db322b987baca1c7dcf023646edb3 not found: ID does not exist" containerID="7701253679bdf34d74e546b732074f80007db322b987baca1c7dcf023646edb3" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.103410 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7701253679bdf34d74e546b732074f80007db322b987baca1c7dcf023646edb3"} err="failed to get container status \"7701253679bdf34d74e546b732074f80007db322b987baca1c7dcf023646edb3\": rpc error: code = NotFound desc = could not find container 
\"7701253679bdf34d74e546b732074f80007db322b987baca1c7dcf023646edb3\": container with ID starting with 7701253679bdf34d74e546b732074f80007db322b987baca1c7dcf023646edb3 not found: ID does not exist" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.103438 4711 scope.go:117] "RemoveContainer" containerID="f6f3813e3a544d8b271331d7ee9a8a57a48139ab389dd92614c08f3de8bc958d" Dec 03 12:28:40 crc kubenswrapper[4711]: E1203 12:28:40.104862 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6f3813e3a544d8b271331d7ee9a8a57a48139ab389dd92614c08f3de8bc958d\": container with ID starting with f6f3813e3a544d8b271331d7ee9a8a57a48139ab389dd92614c08f3de8bc958d not found: ID does not exist" containerID="f6f3813e3a544d8b271331d7ee9a8a57a48139ab389dd92614c08f3de8bc958d" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.104923 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6f3813e3a544d8b271331d7ee9a8a57a48139ab389dd92614c08f3de8bc958d"} err="failed to get container status \"f6f3813e3a544d8b271331d7ee9a8a57a48139ab389dd92614c08f3de8bc958d\": rpc error: code = NotFound desc = could not find container \"f6f3813e3a544d8b271331d7ee9a8a57a48139ab389dd92614c08f3de8bc958d\": container with ID starting with f6f3813e3a544d8b271331d7ee9a8a57a48139ab389dd92614c08f3de8bc958d not found: ID does not exist" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.104952 4711 scope.go:117] "RemoveContainer" containerID="7aadbc818efad99609b5ec0b7a2a59373c831d7b7e8dada3e382eb34e4157eda" Dec 03 12:28:40 crc kubenswrapper[4711]: E1203 12:28:40.106312 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7aadbc818efad99609b5ec0b7a2a59373c831d7b7e8dada3e382eb34e4157eda\": container with ID starting with 7aadbc818efad99609b5ec0b7a2a59373c831d7b7e8dada3e382eb34e4157eda not found: ID does not exist" 
containerID="7aadbc818efad99609b5ec0b7a2a59373c831d7b7e8dada3e382eb34e4157eda" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.106346 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aadbc818efad99609b5ec0b7a2a59373c831d7b7e8dada3e382eb34e4157eda"} err="failed to get container status \"7aadbc818efad99609b5ec0b7a2a59373c831d7b7e8dada3e382eb34e4157eda\": rpc error: code = NotFound desc = could not find container \"7aadbc818efad99609b5ec0b7a2a59373c831d7b7e8dada3e382eb34e4157eda\": container with ID starting with 7aadbc818efad99609b5ec0b7a2a59373c831d7b7e8dada3e382eb34e4157eda not found: ID does not exist" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.106371 4711 scope.go:117] "RemoveContainer" containerID="956617d0bd7873e1a4f11bd8c6dbefbcf1969a2e17ea1a362dcb09e054e68f27" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.112414 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rjr4q"] Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.119058 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rjr4q"] Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.122332 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-c8g7k"] Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.122969 4711 scope.go:117] "RemoveContainer" containerID="480b9c510ccff28b8b35bcdbbf01dc802aa4e76f7e727ee89921e21110b3ee38" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.125232 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-c8g7k"] Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.138824 4711 scope.go:117] "RemoveContainer" containerID="289e63ac9e534bb80ffc46bf185e8ce9338aec668de0a77107379eb5f0be5eb5" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.153208 4711 
scope.go:117] "RemoveContainer" containerID="956617d0bd7873e1a4f11bd8c6dbefbcf1969a2e17ea1a362dcb09e054e68f27" Dec 03 12:28:40 crc kubenswrapper[4711]: E1203 12:28:40.153643 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"956617d0bd7873e1a4f11bd8c6dbefbcf1969a2e17ea1a362dcb09e054e68f27\": container with ID starting with 956617d0bd7873e1a4f11bd8c6dbefbcf1969a2e17ea1a362dcb09e054e68f27 not found: ID does not exist" containerID="956617d0bd7873e1a4f11bd8c6dbefbcf1969a2e17ea1a362dcb09e054e68f27" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.153696 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"956617d0bd7873e1a4f11bd8c6dbefbcf1969a2e17ea1a362dcb09e054e68f27"} err="failed to get container status \"956617d0bd7873e1a4f11bd8c6dbefbcf1969a2e17ea1a362dcb09e054e68f27\": rpc error: code = NotFound desc = could not find container \"956617d0bd7873e1a4f11bd8c6dbefbcf1969a2e17ea1a362dcb09e054e68f27\": container with ID starting with 956617d0bd7873e1a4f11bd8c6dbefbcf1969a2e17ea1a362dcb09e054e68f27 not found: ID does not exist" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.153725 4711 scope.go:117] "RemoveContainer" containerID="480b9c510ccff28b8b35bcdbbf01dc802aa4e76f7e727ee89921e21110b3ee38" Dec 03 12:28:40 crc kubenswrapper[4711]: E1203 12:28:40.154069 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"480b9c510ccff28b8b35bcdbbf01dc802aa4e76f7e727ee89921e21110b3ee38\": container with ID starting with 480b9c510ccff28b8b35bcdbbf01dc802aa4e76f7e727ee89921e21110b3ee38 not found: ID does not exist" containerID="480b9c510ccff28b8b35bcdbbf01dc802aa4e76f7e727ee89921e21110b3ee38" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.154113 4711 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"480b9c510ccff28b8b35bcdbbf01dc802aa4e76f7e727ee89921e21110b3ee38"} err="failed to get container status \"480b9c510ccff28b8b35bcdbbf01dc802aa4e76f7e727ee89921e21110b3ee38\": rpc error: code = NotFound desc = could not find container \"480b9c510ccff28b8b35bcdbbf01dc802aa4e76f7e727ee89921e21110b3ee38\": container with ID starting with 480b9c510ccff28b8b35bcdbbf01dc802aa4e76f7e727ee89921e21110b3ee38 not found: ID does not exist" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.154147 4711 scope.go:117] "RemoveContainer" containerID="289e63ac9e534bb80ffc46bf185e8ce9338aec668de0a77107379eb5f0be5eb5" Dec 03 12:28:40 crc kubenswrapper[4711]: E1203 12:28:40.154436 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"289e63ac9e534bb80ffc46bf185e8ce9338aec668de0a77107379eb5f0be5eb5\": container with ID starting with 289e63ac9e534bb80ffc46bf185e8ce9338aec668de0a77107379eb5f0be5eb5 not found: ID does not exist" containerID="289e63ac9e534bb80ffc46bf185e8ce9338aec668de0a77107379eb5f0be5eb5" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.154460 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"289e63ac9e534bb80ffc46bf185e8ce9338aec668de0a77107379eb5f0be5eb5"} err="failed to get container status \"289e63ac9e534bb80ffc46bf185e8ce9338aec668de0a77107379eb5f0be5eb5\": rpc error: code = NotFound desc = could not find container \"289e63ac9e534bb80ffc46bf185e8ce9338aec668de0a77107379eb5f0be5eb5\": container with ID starting with 289e63ac9e534bb80ffc46bf185e8ce9338aec668de0a77107379eb5f0be5eb5 not found: ID does not exist" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.154474 4711 scope.go:117] "RemoveContainer" containerID="25ee83eb1ecb6b22ee0daf5771cd2e5853ee9c9e015b6379e15b9a24e3f1adaf" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.176150 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bm8qz" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.179316 4711 scope.go:117] "RemoveContainer" containerID="b8cce5bd203fd0bc22fc756a03a79d8882d84cf0cd32cde01cee100fdf2b6dea" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.205148 4711 scope.go:117] "RemoveContainer" containerID="25ee83eb1ecb6b22ee0daf5771cd2e5853ee9c9e015b6379e15b9a24e3f1adaf" Dec 03 12:28:40 crc kubenswrapper[4711]: E1203 12:28:40.205820 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25ee83eb1ecb6b22ee0daf5771cd2e5853ee9c9e015b6379e15b9a24e3f1adaf\": container with ID starting with 25ee83eb1ecb6b22ee0daf5771cd2e5853ee9c9e015b6379e15b9a24e3f1adaf not found: ID does not exist" containerID="25ee83eb1ecb6b22ee0daf5771cd2e5853ee9c9e015b6379e15b9a24e3f1adaf" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.205863 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25ee83eb1ecb6b22ee0daf5771cd2e5853ee9c9e015b6379e15b9a24e3f1adaf"} err="failed to get container status \"25ee83eb1ecb6b22ee0daf5771cd2e5853ee9c9e015b6379e15b9a24e3f1adaf\": rpc error: code = NotFound desc = could not find container \"25ee83eb1ecb6b22ee0daf5771cd2e5853ee9c9e015b6379e15b9a24e3f1adaf\": container with ID starting with 25ee83eb1ecb6b22ee0daf5771cd2e5853ee9c9e015b6379e15b9a24e3f1adaf not found: ID does not exist" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.205896 4711 scope.go:117] "RemoveContainer" containerID="b8cce5bd203fd0bc22fc756a03a79d8882d84cf0cd32cde01cee100fdf2b6dea" Dec 03 12:28:40 crc kubenswrapper[4711]: E1203 12:28:40.206362 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8cce5bd203fd0bc22fc756a03a79d8882d84cf0cd32cde01cee100fdf2b6dea\": container with ID starting with 
b8cce5bd203fd0bc22fc756a03a79d8882d84cf0cd32cde01cee100fdf2b6dea not found: ID does not exist" containerID="b8cce5bd203fd0bc22fc756a03a79d8882d84cf0cd32cde01cee100fdf2b6dea" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.206389 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8cce5bd203fd0bc22fc756a03a79d8882d84cf0cd32cde01cee100fdf2b6dea"} err="failed to get container status \"b8cce5bd203fd0bc22fc756a03a79d8882d84cf0cd32cde01cee100fdf2b6dea\": rpc error: code = NotFound desc = could not find container \"b8cce5bd203fd0bc22fc756a03a79d8882d84cf0cd32cde01cee100fdf2b6dea\": container with ID starting with b8cce5bd203fd0bc22fc756a03a79d8882d84cf0cd32cde01cee100fdf2b6dea not found: ID does not exist" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.206412 4711 scope.go:117] "RemoveContainer" containerID="54edfc62586ace4fc50876a6e4e39930a8a706e3d97fa66b3e5c35714404572e" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.221833 4711 scope.go:117] "RemoveContainer" containerID="791af8e590189b2961887ad5f53184b099d4b22bc0b70873f9d713e1026766b3" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.240456 4711 scope.go:117] "RemoveContainer" containerID="db648fe560718a649e6003f03d7f8df1cdfb0eba2f0c2aabbd1e5db66ff0f7d7" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.260393 4711 scope.go:117] "RemoveContainer" containerID="54edfc62586ace4fc50876a6e4e39930a8a706e3d97fa66b3e5c35714404572e" Dec 03 12:28:40 crc kubenswrapper[4711]: E1203 12:28:40.260876 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54edfc62586ace4fc50876a6e4e39930a8a706e3d97fa66b3e5c35714404572e\": container with ID starting with 54edfc62586ace4fc50876a6e4e39930a8a706e3d97fa66b3e5c35714404572e not found: ID does not exist" containerID="54edfc62586ace4fc50876a6e4e39930a8a706e3d97fa66b3e5c35714404572e" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 
12:28:40.260927 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54edfc62586ace4fc50876a6e4e39930a8a706e3d97fa66b3e5c35714404572e"} err="failed to get container status \"54edfc62586ace4fc50876a6e4e39930a8a706e3d97fa66b3e5c35714404572e\": rpc error: code = NotFound desc = could not find container \"54edfc62586ace4fc50876a6e4e39930a8a706e3d97fa66b3e5c35714404572e\": container with ID starting with 54edfc62586ace4fc50876a6e4e39930a8a706e3d97fa66b3e5c35714404572e not found: ID does not exist" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.260954 4711 scope.go:117] "RemoveContainer" containerID="791af8e590189b2961887ad5f53184b099d4b22bc0b70873f9d713e1026766b3" Dec 03 12:28:40 crc kubenswrapper[4711]: E1203 12:28:40.264227 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"791af8e590189b2961887ad5f53184b099d4b22bc0b70873f9d713e1026766b3\": container with ID starting with 791af8e590189b2961887ad5f53184b099d4b22bc0b70873f9d713e1026766b3 not found: ID does not exist" containerID="791af8e590189b2961887ad5f53184b099d4b22bc0b70873f9d713e1026766b3" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.264269 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"791af8e590189b2961887ad5f53184b099d4b22bc0b70873f9d713e1026766b3"} err="failed to get container status \"791af8e590189b2961887ad5f53184b099d4b22bc0b70873f9d713e1026766b3\": rpc error: code = NotFound desc = could not find container \"791af8e590189b2961887ad5f53184b099d4b22bc0b70873f9d713e1026766b3\": container with ID starting with 791af8e590189b2961887ad5f53184b099d4b22bc0b70873f9d713e1026766b3 not found: ID does not exist" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.264301 4711 scope.go:117] "RemoveContainer" containerID="db648fe560718a649e6003f03d7f8df1cdfb0eba2f0c2aabbd1e5db66ff0f7d7" Dec 03 12:28:40 crc 
kubenswrapper[4711]: E1203 12:28:40.269140 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db648fe560718a649e6003f03d7f8df1cdfb0eba2f0c2aabbd1e5db66ff0f7d7\": container with ID starting with db648fe560718a649e6003f03d7f8df1cdfb0eba2f0c2aabbd1e5db66ff0f7d7 not found: ID does not exist" containerID="db648fe560718a649e6003f03d7f8df1cdfb0eba2f0c2aabbd1e5db66ff0f7d7" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.269176 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db648fe560718a649e6003f03d7f8df1cdfb0eba2f0c2aabbd1e5db66ff0f7d7"} err="failed to get container status \"db648fe560718a649e6003f03d7f8df1cdfb0eba2f0c2aabbd1e5db66ff0f7d7\": rpc error: code = NotFound desc = could not find container \"db648fe560718a649e6003f03d7f8df1cdfb0eba2f0c2aabbd1e5db66ff0f7d7\": container with ID starting with db648fe560718a649e6003f03d7f8df1cdfb0eba2f0c2aabbd1e5db66ff0f7d7 not found: ID does not exist" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.269202 4711 scope.go:117] "RemoveContainer" containerID="06ea9839e0370a538de1486cc8d4332a6fdedcd721c817a475f171680ac82f54" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.284780 4711 scope.go:117] "RemoveContainer" containerID="43ee3b491b0dc49f89fa80fbca61ac4f3b4ad3a645a896a7512d4add9c8ceccf" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.306480 4711 scope.go:117] "RemoveContainer" containerID="63e1267656731e9f6a41e3f006402611ec318375e5ede795dd39088ba67782f6" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.387878 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bm8qz"] Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.397957 4711 scope.go:117] "RemoveContainer" containerID="06ea9839e0370a538de1486cc8d4332a6fdedcd721c817a475f171680ac82f54" Dec 03 12:28:40 crc kubenswrapper[4711]: E1203 12:28:40.398383 4711 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06ea9839e0370a538de1486cc8d4332a6fdedcd721c817a475f171680ac82f54\": container with ID starting with 06ea9839e0370a538de1486cc8d4332a6fdedcd721c817a475f171680ac82f54 not found: ID does not exist" containerID="06ea9839e0370a538de1486cc8d4332a6fdedcd721c817a475f171680ac82f54" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.398417 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06ea9839e0370a538de1486cc8d4332a6fdedcd721c817a475f171680ac82f54"} err="failed to get container status \"06ea9839e0370a538de1486cc8d4332a6fdedcd721c817a475f171680ac82f54\": rpc error: code = NotFound desc = could not find container \"06ea9839e0370a538de1486cc8d4332a6fdedcd721c817a475f171680ac82f54\": container with ID starting with 06ea9839e0370a538de1486cc8d4332a6fdedcd721c817a475f171680ac82f54 not found: ID does not exist" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.398443 4711 scope.go:117] "RemoveContainer" containerID="43ee3b491b0dc49f89fa80fbca61ac4f3b4ad3a645a896a7512d4add9c8ceccf" Dec 03 12:28:40 crc kubenswrapper[4711]: E1203 12:28:40.398818 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43ee3b491b0dc49f89fa80fbca61ac4f3b4ad3a645a896a7512d4add9c8ceccf\": container with ID starting with 43ee3b491b0dc49f89fa80fbca61ac4f3b4ad3a645a896a7512d4add9c8ceccf not found: ID does not exist" containerID="43ee3b491b0dc49f89fa80fbca61ac4f3b4ad3a645a896a7512d4add9c8ceccf" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.398854 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43ee3b491b0dc49f89fa80fbca61ac4f3b4ad3a645a896a7512d4add9c8ceccf"} err="failed to get container status \"43ee3b491b0dc49f89fa80fbca61ac4f3b4ad3a645a896a7512d4add9c8ceccf\": rpc error: code = NotFound 
desc = could not find container \"43ee3b491b0dc49f89fa80fbca61ac4f3b4ad3a645a896a7512d4add9c8ceccf\": container with ID starting with 43ee3b491b0dc49f89fa80fbca61ac4f3b4ad3a645a896a7512d4add9c8ceccf not found: ID does not exist" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.398884 4711 scope.go:117] "RemoveContainer" containerID="63e1267656731e9f6a41e3f006402611ec318375e5ede795dd39088ba67782f6" Dec 03 12:28:40 crc kubenswrapper[4711]: E1203 12:28:40.399271 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63e1267656731e9f6a41e3f006402611ec318375e5ede795dd39088ba67782f6\": container with ID starting with 63e1267656731e9f6a41e3f006402611ec318375e5ede795dd39088ba67782f6 not found: ID does not exist" containerID="63e1267656731e9f6a41e3f006402611ec318375e5ede795dd39088ba67782f6" Dec 03 12:28:40 crc kubenswrapper[4711]: I1203 12:28:40.399298 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63e1267656731e9f6a41e3f006402611ec318375e5ede795dd39088ba67782f6"} err="failed to get container status \"63e1267656731e9f6a41e3f006402611ec318375e5ede795dd39088ba67782f6\": rpc error: code = NotFound desc = could not find container \"63e1267656731e9f6a41e3f006402611ec318375e5ede795dd39088ba67782f6\": container with ID starting with 63e1267656731e9f6a41e3f006402611ec318375e5ede795dd39088ba67782f6 not found: ID does not exist" Dec 03 12:28:40 crc kubenswrapper[4711]: W1203 12:28:40.402425 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcb8fdb7_5812_4474_b2dc_31f00bbbd1ff.slice/crio-2bf6bd7b72f70e420d643e08f29de22bbaf2d26423cbf0417529a7343a15f453 WatchSource:0}: Error finding container 2bf6bd7b72f70e420d643e08f29de22bbaf2d26423cbf0417529a7343a15f453: Status 404 returned error can't find the container with id 
2bf6bd7b72f70e420d643e08f29de22bbaf2d26423cbf0417529a7343a15f453 Dec 03 12:28:41 crc kubenswrapper[4711]: I1203 12:28:41.049605 4711 generic.go:334] "Generic (PLEG): container finished" podID="bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff" containerID="c0a536fe61d5267a2e1f614a204accc0c74993e73d186240f154d39d4a93d2ac" exitCode=0 Dec 03 12:28:41 crc kubenswrapper[4711]: I1203 12:28:41.050040 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bm8qz" event={"ID":"bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff","Type":"ContainerDied","Data":"c0a536fe61d5267a2e1f614a204accc0c74993e73d186240f154d39d4a93d2ac"} Dec 03 12:28:41 crc kubenswrapper[4711]: I1203 12:28:41.050265 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bm8qz" event={"ID":"bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff","Type":"ContainerStarted","Data":"2bf6bd7b72f70e420d643e08f29de22bbaf2d26423cbf0417529a7343a15f453"} Dec 03 12:28:41 crc kubenswrapper[4711]: I1203 12:28:41.051546 4711 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 12:28:41 crc kubenswrapper[4711]: I1203 12:28:41.068421 4711 generic.go:334] "Generic (PLEG): container finished" podID="9638f6ab-2a76-4c44-9100-237e91ad4f6f" containerID="879eaec703b77703d7074172cb633350f368f6ca2d2e1675c73a06802f0d7055" exitCode=0 Dec 03 12:28:41 crc kubenswrapper[4711]: I1203 12:28:41.068487 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n6q2z" event={"ID":"9638f6ab-2a76-4c44-9100-237e91ad4f6f","Type":"ContainerDied","Data":"879eaec703b77703d7074172cb633350f368f6ca2d2e1675c73a06802f0d7055"} Dec 03 12:28:41 crc kubenswrapper[4711]: I1203 12:28:41.068545 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n6q2z" 
event={"ID":"9638f6ab-2a76-4c44-9100-237e91ad4f6f","Type":"ContainerStarted","Data":"ff1ed6e1014ddb28943cb2d494d94dbf141d7dbeda3857ae6dec823840339402"} Dec 03 12:28:41 crc kubenswrapper[4711]: I1203 12:28:41.072181 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-sr2lt" Dec 03 12:28:41 crc kubenswrapper[4711]: I1203 12:28:41.601900 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wrqcm"] Dec 03 12:28:41 crc kubenswrapper[4711]: I1203 12:28:41.603190 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wrqcm" Dec 03 12:28:41 crc kubenswrapper[4711]: I1203 12:28:41.611292 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 03 12:28:41 crc kubenswrapper[4711]: I1203 12:28:41.627705 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wrqcm"] Dec 03 12:28:41 crc kubenswrapper[4711]: I1203 12:28:41.727515 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cjt6\" (UniqueName: \"kubernetes.io/projected/de20138d-da5d-4975-b68e-cbd74b7be8cb-kube-api-access-7cjt6\") pod \"redhat-marketplace-wrqcm\" (UID: \"de20138d-da5d-4975-b68e-cbd74b7be8cb\") " pod="openshift-marketplace/redhat-marketplace-wrqcm" Dec 03 12:28:41 crc kubenswrapper[4711]: I1203 12:28:41.727682 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de20138d-da5d-4975-b68e-cbd74b7be8cb-utilities\") pod \"redhat-marketplace-wrqcm\" (UID: \"de20138d-da5d-4975-b68e-cbd74b7be8cb\") " pod="openshift-marketplace/redhat-marketplace-wrqcm" Dec 03 12:28:41 crc kubenswrapper[4711]: I1203 12:28:41.727714 4711 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de20138d-da5d-4975-b68e-cbd74b7be8cb-catalog-content\") pod \"redhat-marketplace-wrqcm\" (UID: \"de20138d-da5d-4975-b68e-cbd74b7be8cb\") " pod="openshift-marketplace/redhat-marketplace-wrqcm" Dec 03 12:28:41 crc kubenswrapper[4711]: I1203 12:28:41.826129 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0908002c-11e7-4804-a059-dbd19dc6d739" path="/var/lib/kubelet/pods/0908002c-11e7-4804-a059-dbd19dc6d739/volumes" Dec 03 12:28:41 crc kubenswrapper[4711]: I1203 12:28:41.827945 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="559a1102-7df3-414e-9a4c-37cd0d35daf1" path="/var/lib/kubelet/pods/559a1102-7df3-414e-9a4c-37cd0d35daf1/volumes" Dec 03 12:28:41 crc kubenswrapper[4711]: I1203 12:28:41.828386 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de20138d-da5d-4975-b68e-cbd74b7be8cb-utilities\") pod \"redhat-marketplace-wrqcm\" (UID: \"de20138d-da5d-4975-b68e-cbd74b7be8cb\") " pod="openshift-marketplace/redhat-marketplace-wrqcm" Dec 03 12:28:41 crc kubenswrapper[4711]: I1203 12:28:41.828442 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de20138d-da5d-4975-b68e-cbd74b7be8cb-catalog-content\") pod \"redhat-marketplace-wrqcm\" (UID: \"de20138d-da5d-4975-b68e-cbd74b7be8cb\") " pod="openshift-marketplace/redhat-marketplace-wrqcm" Dec 03 12:28:41 crc kubenswrapper[4711]: I1203 12:28:41.828491 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cjt6\" (UniqueName: \"kubernetes.io/projected/de20138d-da5d-4975-b68e-cbd74b7be8cb-kube-api-access-7cjt6\") pod \"redhat-marketplace-wrqcm\" (UID: \"de20138d-da5d-4975-b68e-cbd74b7be8cb\") " 
pod="openshift-marketplace/redhat-marketplace-wrqcm" Dec 03 12:28:41 crc kubenswrapper[4711]: I1203 12:28:41.828772 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61b13e0f-7d37-469f-bf80-3c4b455c34b0" path="/var/lib/kubelet/pods/61b13e0f-7d37-469f-bf80-3c4b455c34b0/volumes" Dec 03 12:28:41 crc kubenswrapper[4711]: I1203 12:28:41.829071 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de20138d-da5d-4975-b68e-cbd74b7be8cb-utilities\") pod \"redhat-marketplace-wrqcm\" (UID: \"de20138d-da5d-4975-b68e-cbd74b7be8cb\") " pod="openshift-marketplace/redhat-marketplace-wrqcm" Dec 03 12:28:41 crc kubenswrapper[4711]: I1203 12:28:41.829142 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de20138d-da5d-4975-b68e-cbd74b7be8cb-catalog-content\") pod \"redhat-marketplace-wrqcm\" (UID: \"de20138d-da5d-4975-b68e-cbd74b7be8cb\") " pod="openshift-marketplace/redhat-marketplace-wrqcm" Dec 03 12:28:41 crc kubenswrapper[4711]: I1203 12:28:41.833288 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bc96e8b-e5df-40a7-8690-530895999a16" path="/var/lib/kubelet/pods/9bc96e8b-e5df-40a7-8690-530895999a16/volumes" Dec 03 12:28:41 crc kubenswrapper[4711]: I1203 12:28:41.834351 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fff3e844-ad61-4914-aaff-fdd90e3a9a58" path="/var/lib/kubelet/pods/fff3e844-ad61-4914-aaff-fdd90e3a9a58/volumes" Dec 03 12:28:41 crc kubenswrapper[4711]: I1203 12:28:41.850250 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cjt6\" (UniqueName: \"kubernetes.io/projected/de20138d-da5d-4975-b68e-cbd74b7be8cb-kube-api-access-7cjt6\") pod \"redhat-marketplace-wrqcm\" (UID: \"de20138d-da5d-4975-b68e-cbd74b7be8cb\") " pod="openshift-marketplace/redhat-marketplace-wrqcm" Dec 03 12:28:41 crc 
kubenswrapper[4711]: I1203 12:28:41.928149 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 03 12:28:41 crc kubenswrapper[4711]: I1203 12:28:41.935791 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wrqcm" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.000977 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kcwq6"] Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.001857 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kcwq6" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.013682 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kcwq6"] Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.032797 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20c9cd89-1c58-47a2-941a-76091eda8baa-utilities\") pod \"redhat-marketplace-kcwq6\" (UID: \"20c9cd89-1c58-47a2-941a-76091eda8baa\") " pod="openshift-marketplace/redhat-marketplace-kcwq6" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.032854 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20c9cd89-1c58-47a2-941a-76091eda8baa-catalog-content\") pod \"redhat-marketplace-kcwq6\" (UID: \"20c9cd89-1c58-47a2-941a-76091eda8baa\") " pod="openshift-marketplace/redhat-marketplace-kcwq6" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.032877 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5scc\" (UniqueName: 
\"kubernetes.io/projected/20c9cd89-1c58-47a2-941a-76091eda8baa-kube-api-access-w5scc\") pod \"redhat-marketplace-kcwq6\" (UID: \"20c9cd89-1c58-47a2-941a-76091eda8baa\") " pod="openshift-marketplace/redhat-marketplace-kcwq6" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.129251 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wrqcm"] Dec 03 12:28:42 crc kubenswrapper[4711]: W1203 12:28:42.132527 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde20138d_da5d_4975_b68e_cbd74b7be8cb.slice/crio-bc1b9c489bf7b5c23882ce8f5bf9ffcabad7a975c1c017e599425f8c6323555f WatchSource:0}: Error finding container bc1b9c489bf7b5c23882ce8f5bf9ffcabad7a975c1c017e599425f8c6323555f: Status 404 returned error can't find the container with id bc1b9c489bf7b5c23882ce8f5bf9ffcabad7a975c1c017e599425f8c6323555f Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.133345 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20c9cd89-1c58-47a2-941a-76091eda8baa-catalog-content\") pod \"redhat-marketplace-kcwq6\" (UID: \"20c9cd89-1c58-47a2-941a-76091eda8baa\") " pod="openshift-marketplace/redhat-marketplace-kcwq6" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.133378 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5scc\" (UniqueName: \"kubernetes.io/projected/20c9cd89-1c58-47a2-941a-76091eda8baa-kube-api-access-w5scc\") pod \"redhat-marketplace-kcwq6\" (UID: \"20c9cd89-1c58-47a2-941a-76091eda8baa\") " pod="openshift-marketplace/redhat-marketplace-kcwq6" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.133461 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20c9cd89-1c58-47a2-941a-76091eda8baa-utilities\") pod 
\"redhat-marketplace-kcwq6\" (UID: \"20c9cd89-1c58-47a2-941a-76091eda8baa\") " pod="openshift-marketplace/redhat-marketplace-kcwq6" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.134391 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20c9cd89-1c58-47a2-941a-76091eda8baa-utilities\") pod \"redhat-marketplace-kcwq6\" (UID: \"20c9cd89-1c58-47a2-941a-76091eda8baa\") " pod="openshift-marketplace/redhat-marketplace-kcwq6" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.134557 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20c9cd89-1c58-47a2-941a-76091eda8baa-catalog-content\") pod \"redhat-marketplace-kcwq6\" (UID: \"20c9cd89-1c58-47a2-941a-76091eda8baa\") " pod="openshift-marketplace/redhat-marketplace-kcwq6" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.155056 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5scc\" (UniqueName: \"kubernetes.io/projected/20c9cd89-1c58-47a2-941a-76091eda8baa-kube-api-access-w5scc\") pod \"redhat-marketplace-kcwq6\" (UID: \"20c9cd89-1c58-47a2-941a-76091eda8baa\") " pod="openshift-marketplace/redhat-marketplace-kcwq6" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.202925 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tltll"] Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.205495 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tltll" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.208378 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.211242 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tltll"] Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.234370 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b4e35ef-e95e-42b5-ae3d-f5f92bfec867-catalog-content\") pod \"redhat-operators-tltll\" (UID: \"8b4e35ef-e95e-42b5-ae3d-f5f92bfec867\") " pod="openshift-marketplace/redhat-operators-tltll" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.234450 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b4e35ef-e95e-42b5-ae3d-f5f92bfec867-utilities\") pod \"redhat-operators-tltll\" (UID: \"8b4e35ef-e95e-42b5-ae3d-f5f92bfec867\") " pod="openshift-marketplace/redhat-operators-tltll" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.234491 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mh6b\" (UniqueName: \"kubernetes.io/projected/8b4e35ef-e95e-42b5-ae3d-f5f92bfec867-kube-api-access-4mh6b\") pod \"redhat-operators-tltll\" (UID: \"8b4e35ef-e95e-42b5-ae3d-f5f92bfec867\") " pod="openshift-marketplace/redhat-operators-tltll" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.335691 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b4e35ef-e95e-42b5-ae3d-f5f92bfec867-catalog-content\") pod \"redhat-operators-tltll\" (UID: 
\"8b4e35ef-e95e-42b5-ae3d-f5f92bfec867\") " pod="openshift-marketplace/redhat-operators-tltll" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.335790 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b4e35ef-e95e-42b5-ae3d-f5f92bfec867-utilities\") pod \"redhat-operators-tltll\" (UID: \"8b4e35ef-e95e-42b5-ae3d-f5f92bfec867\") " pod="openshift-marketplace/redhat-operators-tltll" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.335838 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mh6b\" (UniqueName: \"kubernetes.io/projected/8b4e35ef-e95e-42b5-ae3d-f5f92bfec867-kube-api-access-4mh6b\") pod \"redhat-operators-tltll\" (UID: \"8b4e35ef-e95e-42b5-ae3d-f5f92bfec867\") " pod="openshift-marketplace/redhat-operators-tltll" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.336466 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b4e35ef-e95e-42b5-ae3d-f5f92bfec867-catalog-content\") pod \"redhat-operators-tltll\" (UID: \"8b4e35ef-e95e-42b5-ae3d-f5f92bfec867\") " pod="openshift-marketplace/redhat-operators-tltll" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.336552 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b4e35ef-e95e-42b5-ae3d-f5f92bfec867-utilities\") pod \"redhat-operators-tltll\" (UID: \"8b4e35ef-e95e-42b5-ae3d-f5f92bfec867\") " pod="openshift-marketplace/redhat-operators-tltll" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.346323 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kcwq6" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.356247 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mh6b\" (UniqueName: \"kubernetes.io/projected/8b4e35ef-e95e-42b5-ae3d-f5f92bfec867-kube-api-access-4mh6b\") pod \"redhat-operators-tltll\" (UID: \"8b4e35ef-e95e-42b5-ae3d-f5f92bfec867\") " pod="openshift-marketplace/redhat-operators-tltll" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.542449 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tltll" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.604990 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q7vzf"] Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.610138 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kcwq6"] Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.610269 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q7vzf" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.642838 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q7vzf"] Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.744746 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/357500e9-9abf-48f2-a476-ccecf371bfbe-catalog-content\") pod \"redhat-operators-q7vzf\" (UID: \"357500e9-9abf-48f2-a476-ccecf371bfbe\") " pod="openshift-marketplace/redhat-operators-q7vzf" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.744793 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/357500e9-9abf-48f2-a476-ccecf371bfbe-utilities\") pod \"redhat-operators-q7vzf\" (UID: \"357500e9-9abf-48f2-a476-ccecf371bfbe\") " pod="openshift-marketplace/redhat-operators-q7vzf" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.744823 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd4zc\" (UniqueName: \"kubernetes.io/projected/357500e9-9abf-48f2-a476-ccecf371bfbe-kube-api-access-dd4zc\") pod \"redhat-operators-q7vzf\" (UID: \"357500e9-9abf-48f2-a476-ccecf371bfbe\") " pod="openshift-marketplace/redhat-operators-q7vzf" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.831051 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tltll"] Dec 03 12:28:42 crc kubenswrapper[4711]: W1203 12:28:42.836857 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b4e35ef_e95e_42b5_ae3d_f5f92bfec867.slice/crio-5cd33b649f7b7c4bc787c9a22df697cfd49dec097adca79bc6d7fdbc2fe814fd WatchSource:0}: Error finding container 
5cd33b649f7b7c4bc787c9a22df697cfd49dec097adca79bc6d7fdbc2fe814fd: Status 404 returned error can't find the container with id 5cd33b649f7b7c4bc787c9a22df697cfd49dec097adca79bc6d7fdbc2fe814fd Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.845474 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/357500e9-9abf-48f2-a476-ccecf371bfbe-utilities\") pod \"redhat-operators-q7vzf\" (UID: \"357500e9-9abf-48f2-a476-ccecf371bfbe\") " pod="openshift-marketplace/redhat-operators-q7vzf" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.845532 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd4zc\" (UniqueName: \"kubernetes.io/projected/357500e9-9abf-48f2-a476-ccecf371bfbe-kube-api-access-dd4zc\") pod \"redhat-operators-q7vzf\" (UID: \"357500e9-9abf-48f2-a476-ccecf371bfbe\") " pod="openshift-marketplace/redhat-operators-q7vzf" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.845583 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/357500e9-9abf-48f2-a476-ccecf371bfbe-catalog-content\") pod \"redhat-operators-q7vzf\" (UID: \"357500e9-9abf-48f2-a476-ccecf371bfbe\") " pod="openshift-marketplace/redhat-operators-q7vzf" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.846043 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/357500e9-9abf-48f2-a476-ccecf371bfbe-catalog-content\") pod \"redhat-operators-q7vzf\" (UID: \"357500e9-9abf-48f2-a476-ccecf371bfbe\") " pod="openshift-marketplace/redhat-operators-q7vzf" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.846086 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/357500e9-9abf-48f2-a476-ccecf371bfbe-utilities\") pod 
\"redhat-operators-q7vzf\" (UID: \"357500e9-9abf-48f2-a476-ccecf371bfbe\") " pod="openshift-marketplace/redhat-operators-q7vzf" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.870630 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd4zc\" (UniqueName: \"kubernetes.io/projected/357500e9-9abf-48f2-a476-ccecf371bfbe-kube-api-access-dd4zc\") pod \"redhat-operators-q7vzf\" (UID: \"357500e9-9abf-48f2-a476-ccecf371bfbe\") " pod="openshift-marketplace/redhat-operators-q7vzf" Dec 03 12:28:42 crc kubenswrapper[4711]: I1203 12:28:42.935872 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q7vzf" Dec 03 12:28:43 crc kubenswrapper[4711]: I1203 12:28:43.085409 4711 generic.go:334] "Generic (PLEG): container finished" podID="de20138d-da5d-4975-b68e-cbd74b7be8cb" containerID="dd0843fbd1883af3f651fa9311657b63a1ba42684ef660d142c8f62628e46052" exitCode=0 Dec 03 12:28:43 crc kubenswrapper[4711]: I1203 12:28:43.085796 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wrqcm" event={"ID":"de20138d-da5d-4975-b68e-cbd74b7be8cb","Type":"ContainerDied","Data":"dd0843fbd1883af3f651fa9311657b63a1ba42684ef660d142c8f62628e46052"} Dec 03 12:28:43 crc kubenswrapper[4711]: I1203 12:28:43.085861 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wrqcm" event={"ID":"de20138d-da5d-4975-b68e-cbd74b7be8cb","Type":"ContainerStarted","Data":"bc1b9c489bf7b5c23882ce8f5bf9ffcabad7a975c1c017e599425f8c6323555f"} Dec 03 12:28:43 crc kubenswrapper[4711]: I1203 12:28:43.089236 4711 generic.go:334] "Generic (PLEG): container finished" podID="20c9cd89-1c58-47a2-941a-76091eda8baa" containerID="3a48546805a8f45bcf1ed08cf0c58a95dfdfdec94782808f3a0312479f13c214" exitCode=0 Dec 03 12:28:43 crc kubenswrapper[4711]: I1203 12:28:43.089320 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-kcwq6" event={"ID":"20c9cd89-1c58-47a2-941a-76091eda8baa","Type":"ContainerDied","Data":"3a48546805a8f45bcf1ed08cf0c58a95dfdfdec94782808f3a0312479f13c214"} Dec 03 12:28:43 crc kubenswrapper[4711]: I1203 12:28:43.089346 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kcwq6" event={"ID":"20c9cd89-1c58-47a2-941a-76091eda8baa","Type":"ContainerStarted","Data":"a8bd199c720c7fc371659e96b2b66cadccd7c88d5d03c886fe3e3e776a7c26f7"} Dec 03 12:28:43 crc kubenswrapper[4711]: I1203 12:28:43.093095 4711 generic.go:334] "Generic (PLEG): container finished" podID="9638f6ab-2a76-4c44-9100-237e91ad4f6f" containerID="0b64c1ff2b94cbf9fec780896ee2fa70b9613bb8b9a11320f2904a6c322edd40" exitCode=0 Dec 03 12:28:43 crc kubenswrapper[4711]: I1203 12:28:43.093215 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n6q2z" event={"ID":"9638f6ab-2a76-4c44-9100-237e91ad4f6f","Type":"ContainerDied","Data":"0b64c1ff2b94cbf9fec780896ee2fa70b9613bb8b9a11320f2904a6c322edd40"} Dec 03 12:28:43 crc kubenswrapper[4711]: I1203 12:28:43.101356 4711 generic.go:334] "Generic (PLEG): container finished" podID="bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff" containerID="9c3c598e7eca6587c566f7e0049b78407ca1ced734472560545e594cee9f64ab" exitCode=0 Dec 03 12:28:43 crc kubenswrapper[4711]: I1203 12:28:43.101438 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bm8qz" event={"ID":"bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff","Type":"ContainerDied","Data":"9c3c598e7eca6587c566f7e0049b78407ca1ced734472560545e594cee9f64ab"} Dec 03 12:28:43 crc kubenswrapper[4711]: I1203 12:28:43.105778 4711 generic.go:334] "Generic (PLEG): container finished" podID="8b4e35ef-e95e-42b5-ae3d-f5f92bfec867" containerID="27229756c55b5be9ff28459f6406dbfa346e7be6ab2ea34bd5bf3932a20d3001" exitCode=0 Dec 03 12:28:43 crc kubenswrapper[4711]: I1203 12:28:43.107494 
4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tltll" event={"ID":"8b4e35ef-e95e-42b5-ae3d-f5f92bfec867","Type":"ContainerDied","Data":"27229756c55b5be9ff28459f6406dbfa346e7be6ab2ea34bd5bf3932a20d3001"} Dec 03 12:28:43 crc kubenswrapper[4711]: I1203 12:28:43.107534 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tltll" event={"ID":"8b4e35ef-e95e-42b5-ae3d-f5f92bfec867","Type":"ContainerStarted","Data":"5cd33b649f7b7c4bc787c9a22df697cfd49dec097adca79bc6d7fdbc2fe814fd"} Dec 03 12:28:43 crc kubenswrapper[4711]: I1203 12:28:43.118536 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q7vzf"] Dec 03 12:28:44 crc kubenswrapper[4711]: I1203 12:28:44.119796 4711 generic.go:334] "Generic (PLEG): container finished" podID="357500e9-9abf-48f2-a476-ccecf371bfbe" containerID="d2d91c5f3f30421a3d8c040c3f6c7da136025984d3f9ed1cfc995fb4279d41b0" exitCode=0 Dec 03 12:28:44 crc kubenswrapper[4711]: I1203 12:28:44.119877 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7vzf" event={"ID":"357500e9-9abf-48f2-a476-ccecf371bfbe","Type":"ContainerDied","Data":"d2d91c5f3f30421a3d8c040c3f6c7da136025984d3f9ed1cfc995fb4279d41b0"} Dec 03 12:28:44 crc kubenswrapper[4711]: I1203 12:28:44.120299 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7vzf" event={"ID":"357500e9-9abf-48f2-a476-ccecf371bfbe","Type":"ContainerStarted","Data":"e97dce353c939ecc22565d2d062b620c79efa20bcf72167480b60e4dd2a151ee"} Dec 03 12:28:44 crc kubenswrapper[4711]: I1203 12:28:44.124375 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n6q2z" event={"ID":"9638f6ab-2a76-4c44-9100-237e91ad4f6f","Type":"ContainerStarted","Data":"521f9b1964c5a63c8c464ee6e4fa6196f3314338e3714454cc1d140e90aef4a6"} Dec 03 12:28:44 crc 
kubenswrapper[4711]: I1203 12:28:44.126112 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bm8qz" event={"ID":"bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff","Type":"ContainerStarted","Data":"6919cd3853a00e24126a78258530be384a92bb3d4803412d598d77093d2e55be"} Dec 03 12:28:44 crc kubenswrapper[4711]: I1203 12:28:44.163588 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-n6q2z" podStartSLOduration=2.636068099 podStartE2EDuration="5.163564215s" podCreationTimestamp="2025-12-03 12:28:39 +0000 UTC" firstStartedPulling="2025-12-03 12:28:41.071759424 +0000 UTC m=+839.741010679" lastFinishedPulling="2025-12-03 12:28:43.59925554 +0000 UTC m=+842.268506795" observedRunningTime="2025-12-03 12:28:44.159326139 +0000 UTC m=+842.828577404" watchObservedRunningTime="2025-12-03 12:28:44.163564215 +0000 UTC m=+842.832815480" Dec 03 12:28:44 crc kubenswrapper[4711]: I1203 12:28:44.193993 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bm8qz" podStartSLOduration=2.660953671 podStartE2EDuration="5.19397654s" podCreationTimestamp="2025-12-03 12:28:39 +0000 UTC" firstStartedPulling="2025-12-03 12:28:41.051290001 +0000 UTC m=+839.720541266" lastFinishedPulling="2025-12-03 12:28:43.58431288 +0000 UTC m=+842.253564135" observedRunningTime="2025-12-03 12:28:44.190457984 +0000 UTC m=+842.859709259" watchObservedRunningTime="2025-12-03 12:28:44.19397654 +0000 UTC m=+842.863227795" Dec 03 12:28:44 crc kubenswrapper[4711]: I1203 12:28:44.404476 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gsnk6"] Dec 03 12:28:44 crc kubenswrapper[4711]: I1203 12:28:44.405835 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gsnk6" Dec 03 12:28:44 crc kubenswrapper[4711]: I1203 12:28:44.429607 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gsnk6"] Dec 03 12:28:44 crc kubenswrapper[4711]: I1203 12:28:44.567986 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/888df056-de6f-4c3c-80b3-3029afdcfe85-catalog-content\") pod \"certified-operators-gsnk6\" (UID: \"888df056-de6f-4c3c-80b3-3029afdcfe85\") " pod="openshift-marketplace/certified-operators-gsnk6" Dec 03 12:28:44 crc kubenswrapper[4711]: I1203 12:28:44.568050 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/888df056-de6f-4c3c-80b3-3029afdcfe85-utilities\") pod \"certified-operators-gsnk6\" (UID: \"888df056-de6f-4c3c-80b3-3029afdcfe85\") " pod="openshift-marketplace/certified-operators-gsnk6" Dec 03 12:28:44 crc kubenswrapper[4711]: I1203 12:28:44.568092 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4j5q\" (UniqueName: \"kubernetes.io/projected/888df056-de6f-4c3c-80b3-3029afdcfe85-kube-api-access-p4j5q\") pod \"certified-operators-gsnk6\" (UID: \"888df056-de6f-4c3c-80b3-3029afdcfe85\") " pod="openshift-marketplace/certified-operators-gsnk6" Dec 03 12:28:44 crc kubenswrapper[4711]: I1203 12:28:44.668887 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/888df056-de6f-4c3c-80b3-3029afdcfe85-catalog-content\") pod \"certified-operators-gsnk6\" (UID: \"888df056-de6f-4c3c-80b3-3029afdcfe85\") " pod="openshift-marketplace/certified-operators-gsnk6" Dec 03 12:28:44 crc kubenswrapper[4711]: I1203 12:28:44.668979 4711 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/888df056-de6f-4c3c-80b3-3029afdcfe85-utilities\") pod \"certified-operators-gsnk6\" (UID: \"888df056-de6f-4c3c-80b3-3029afdcfe85\") " pod="openshift-marketplace/certified-operators-gsnk6" Dec 03 12:28:44 crc kubenswrapper[4711]: I1203 12:28:44.669013 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4j5q\" (UniqueName: \"kubernetes.io/projected/888df056-de6f-4c3c-80b3-3029afdcfe85-kube-api-access-p4j5q\") pod \"certified-operators-gsnk6\" (UID: \"888df056-de6f-4c3c-80b3-3029afdcfe85\") " pod="openshift-marketplace/certified-operators-gsnk6" Dec 03 12:28:44 crc kubenswrapper[4711]: I1203 12:28:44.669449 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/888df056-de6f-4c3c-80b3-3029afdcfe85-catalog-content\") pod \"certified-operators-gsnk6\" (UID: \"888df056-de6f-4c3c-80b3-3029afdcfe85\") " pod="openshift-marketplace/certified-operators-gsnk6" Dec 03 12:28:44 crc kubenswrapper[4711]: I1203 12:28:44.669618 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/888df056-de6f-4c3c-80b3-3029afdcfe85-utilities\") pod \"certified-operators-gsnk6\" (UID: \"888df056-de6f-4c3c-80b3-3029afdcfe85\") " pod="openshift-marketplace/certified-operators-gsnk6" Dec 03 12:28:44 crc kubenswrapper[4711]: I1203 12:28:44.697660 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4j5q\" (UniqueName: \"kubernetes.io/projected/888df056-de6f-4c3c-80b3-3029afdcfe85-kube-api-access-p4j5q\") pod \"certified-operators-gsnk6\" (UID: \"888df056-de6f-4c3c-80b3-3029afdcfe85\") " pod="openshift-marketplace/certified-operators-gsnk6" Dec 03 12:28:44 crc kubenswrapper[4711]: I1203 12:28:44.722694 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gsnk6" Dec 03 12:28:44 crc kubenswrapper[4711]: I1203 12:28:44.962330 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gsnk6"] Dec 03 12:28:44 crc kubenswrapper[4711]: W1203 12:28:44.969353 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod888df056_de6f_4c3c_80b3_3029afdcfe85.slice/crio-29280abe74a209c2ff2edd1e0bedd61abf7321760db801ac58e95008eb649443 WatchSource:0}: Error finding container 29280abe74a209c2ff2edd1e0bedd61abf7321760db801ac58e95008eb649443: Status 404 returned error can't find the container with id 29280abe74a209c2ff2edd1e0bedd61abf7321760db801ac58e95008eb649443 Dec 03 12:28:45 crc kubenswrapper[4711]: I1203 12:28:45.002125 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ccx4z"] Dec 03 12:28:45 crc kubenswrapper[4711]: I1203 12:28:45.005653 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ccx4z" Dec 03 12:28:45 crc kubenswrapper[4711]: I1203 12:28:45.032948 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ccx4z"] Dec 03 12:28:45 crc kubenswrapper[4711]: I1203 12:28:45.134454 4711 generic.go:334] "Generic (PLEG): container finished" podID="de20138d-da5d-4975-b68e-cbd74b7be8cb" containerID="d21d700c8b6e571da4e0c5d6831919e509d1619f68d8ac2dba465f3b0bbcfcf5" exitCode=0 Dec 03 12:28:45 crc kubenswrapper[4711]: I1203 12:28:45.134530 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wrqcm" event={"ID":"de20138d-da5d-4975-b68e-cbd74b7be8cb","Type":"ContainerDied","Data":"d21d700c8b6e571da4e0c5d6831919e509d1619f68d8ac2dba465f3b0bbcfcf5"} Dec 03 12:28:45 crc kubenswrapper[4711]: I1203 12:28:45.136313 4711 generic.go:334] "Generic (PLEG): container finished" podID="20c9cd89-1c58-47a2-941a-76091eda8baa" containerID="36ecb34f603231eeef40c4a770fd131cf81f4c0e02e065d29459416ff438c52d" exitCode=0 Dec 03 12:28:45 crc kubenswrapper[4711]: I1203 12:28:45.136369 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kcwq6" event={"ID":"20c9cd89-1c58-47a2-941a-76091eda8baa","Type":"ContainerDied","Data":"36ecb34f603231eeef40c4a770fd131cf81f4c0e02e065d29459416ff438c52d"} Dec 03 12:28:45 crc kubenswrapper[4711]: I1203 12:28:45.145381 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsnk6" event={"ID":"888df056-de6f-4c3c-80b3-3029afdcfe85","Type":"ContainerStarted","Data":"29280abe74a209c2ff2edd1e0bedd61abf7321760db801ac58e95008eb649443"} Dec 03 12:28:45 crc kubenswrapper[4711]: I1203 12:28:45.153465 4711 generic.go:334] "Generic (PLEG): container finished" podID="8b4e35ef-e95e-42b5-ae3d-f5f92bfec867" containerID="1a45fb4e835741c946ac9110a8cb7f38fcc0937c284e495348f35dbe19855c17" exitCode=0 Dec 03 12:28:45 
crc kubenswrapper[4711]: I1203 12:28:45.154548 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tltll" event={"ID":"8b4e35ef-e95e-42b5-ae3d-f5f92bfec867","Type":"ContainerDied","Data":"1a45fb4e835741c946ac9110a8cb7f38fcc0937c284e495348f35dbe19855c17"} Dec 03 12:28:45 crc kubenswrapper[4711]: I1203 12:28:45.173739 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef97ed00-d8de-42fd-94ab-8ff1b1569291-utilities\") pod \"community-operators-ccx4z\" (UID: \"ef97ed00-d8de-42fd-94ab-8ff1b1569291\") " pod="openshift-marketplace/community-operators-ccx4z" Dec 03 12:28:45 crc kubenswrapper[4711]: I1203 12:28:45.173834 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqsgq\" (UniqueName: \"kubernetes.io/projected/ef97ed00-d8de-42fd-94ab-8ff1b1569291-kube-api-access-tqsgq\") pod \"community-operators-ccx4z\" (UID: \"ef97ed00-d8de-42fd-94ab-8ff1b1569291\") " pod="openshift-marketplace/community-operators-ccx4z" Dec 03 12:28:45 crc kubenswrapper[4711]: I1203 12:28:45.173876 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef97ed00-d8de-42fd-94ab-8ff1b1569291-catalog-content\") pod \"community-operators-ccx4z\" (UID: \"ef97ed00-d8de-42fd-94ab-8ff1b1569291\") " pod="openshift-marketplace/community-operators-ccx4z" Dec 03 12:28:45 crc kubenswrapper[4711]: I1203 12:28:45.285412 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef97ed00-d8de-42fd-94ab-8ff1b1569291-utilities\") pod \"community-operators-ccx4z\" (UID: \"ef97ed00-d8de-42fd-94ab-8ff1b1569291\") " pod="openshift-marketplace/community-operators-ccx4z" Dec 03 12:28:45 crc kubenswrapper[4711]: I1203 12:28:45.285860 
4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqsgq\" (UniqueName: \"kubernetes.io/projected/ef97ed00-d8de-42fd-94ab-8ff1b1569291-kube-api-access-tqsgq\") pod \"community-operators-ccx4z\" (UID: \"ef97ed00-d8de-42fd-94ab-8ff1b1569291\") " pod="openshift-marketplace/community-operators-ccx4z" Dec 03 12:28:45 crc kubenswrapper[4711]: I1203 12:28:45.285934 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef97ed00-d8de-42fd-94ab-8ff1b1569291-catalog-content\") pod \"community-operators-ccx4z\" (UID: \"ef97ed00-d8de-42fd-94ab-8ff1b1569291\") " pod="openshift-marketplace/community-operators-ccx4z" Dec 03 12:28:45 crc kubenswrapper[4711]: I1203 12:28:45.287369 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef97ed00-d8de-42fd-94ab-8ff1b1569291-utilities\") pod \"community-operators-ccx4z\" (UID: \"ef97ed00-d8de-42fd-94ab-8ff1b1569291\") " pod="openshift-marketplace/community-operators-ccx4z" Dec 03 12:28:45 crc kubenswrapper[4711]: I1203 12:28:45.288202 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef97ed00-d8de-42fd-94ab-8ff1b1569291-catalog-content\") pod \"community-operators-ccx4z\" (UID: \"ef97ed00-d8de-42fd-94ab-8ff1b1569291\") " pod="openshift-marketplace/community-operators-ccx4z" Dec 03 12:28:45 crc kubenswrapper[4711]: I1203 12:28:45.310392 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqsgq\" (UniqueName: \"kubernetes.io/projected/ef97ed00-d8de-42fd-94ab-8ff1b1569291-kube-api-access-tqsgq\") pod \"community-operators-ccx4z\" (UID: \"ef97ed00-d8de-42fd-94ab-8ff1b1569291\") " pod="openshift-marketplace/community-operators-ccx4z" Dec 03 12:28:45 crc kubenswrapper[4711]: I1203 12:28:45.367960 4711 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ccx4z" Dec 03 12:28:45 crc kubenswrapper[4711]: I1203 12:28:45.625420 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ccx4z"] Dec 03 12:28:45 crc kubenswrapper[4711]: W1203 12:28:45.638782 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef97ed00_d8de_42fd_94ab_8ff1b1569291.slice/crio-966556d0f28f3aa0b2d800805585e156b94b5e00490c1668e4ffd3d830bc67de WatchSource:0}: Error finding container 966556d0f28f3aa0b2d800805585e156b94b5e00490c1668e4ffd3d830bc67de: Status 404 returned error can't find the container with id 966556d0f28f3aa0b2d800805585e156b94b5e00490c1668e4ffd3d830bc67de Dec 03 12:28:46 crc kubenswrapper[4711]: I1203 12:28:46.161187 4711 generic.go:334] "Generic (PLEG): container finished" podID="888df056-de6f-4c3c-80b3-3029afdcfe85" containerID="7e9c02153ba803e9d048f7f0ce9078d894fb27fec887b1c346935da265e19ee0" exitCode=0 Dec 03 12:28:46 crc kubenswrapper[4711]: I1203 12:28:46.161356 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsnk6" event={"ID":"888df056-de6f-4c3c-80b3-3029afdcfe85","Type":"ContainerDied","Data":"7e9c02153ba803e9d048f7f0ce9078d894fb27fec887b1c346935da265e19ee0"} Dec 03 12:28:46 crc kubenswrapper[4711]: I1203 12:28:46.164821 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ccx4z" event={"ID":"ef97ed00-d8de-42fd-94ab-8ff1b1569291","Type":"ContainerStarted","Data":"29f87aaf656eec448f5e17693249b5db2e179b7e908b39f6b8764304dfec5e26"} Dec 03 12:28:46 crc kubenswrapper[4711]: I1203 12:28:46.164876 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ccx4z" 
event={"ID":"ef97ed00-d8de-42fd-94ab-8ff1b1569291","Type":"ContainerStarted","Data":"966556d0f28f3aa0b2d800805585e156b94b5e00490c1668e4ffd3d830bc67de"} Dec 03 12:28:46 crc kubenswrapper[4711]: I1203 12:28:46.170129 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wrqcm" event={"ID":"de20138d-da5d-4975-b68e-cbd74b7be8cb","Type":"ContainerStarted","Data":"b4e14a2d8af4a8238f245e0a7f02815d74e5fe7b0bfc76ea1a5106c5f52a925d"} Dec 03 12:28:46 crc kubenswrapper[4711]: I1203 12:28:46.173115 4711 generic.go:334] "Generic (PLEG): container finished" podID="357500e9-9abf-48f2-a476-ccecf371bfbe" containerID="a5ca73b2ada72e197c4f03e8bc2b913b3989db8f8b9e4a49af5684524927c93f" exitCode=0 Dec 03 12:28:46 crc kubenswrapper[4711]: I1203 12:28:46.173177 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7vzf" event={"ID":"357500e9-9abf-48f2-a476-ccecf371bfbe","Type":"ContainerDied","Data":"a5ca73b2ada72e197c4f03e8bc2b913b3989db8f8b9e4a49af5684524927c93f"} Dec 03 12:28:46 crc kubenswrapper[4711]: I1203 12:28:46.230768 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wrqcm" podStartSLOduration=2.510730637 podStartE2EDuration="5.230750224s" podCreationTimestamp="2025-12-03 12:28:41 +0000 UTC" firstStartedPulling="2025-12-03 12:28:43.087274412 +0000 UTC m=+841.756525667" lastFinishedPulling="2025-12-03 12:28:45.807293999 +0000 UTC m=+844.476545254" observedRunningTime="2025-12-03 12:28:46.227960727 +0000 UTC m=+844.897211992" watchObservedRunningTime="2025-12-03 12:28:46.230750224 +0000 UTC m=+844.900001479" Dec 03 12:28:47 crc kubenswrapper[4711]: I1203 12:28:47.182923 4711 generic.go:334] "Generic (PLEG): container finished" podID="ef97ed00-d8de-42fd-94ab-8ff1b1569291" containerID="29f87aaf656eec448f5e17693249b5db2e179b7e908b39f6b8764304dfec5e26" exitCode=0 Dec 03 12:28:47 crc kubenswrapper[4711]: I1203 12:28:47.183019 
4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ccx4z" event={"ID":"ef97ed00-d8de-42fd-94ab-8ff1b1569291","Type":"ContainerDied","Data":"29f87aaf656eec448f5e17693249b5db2e179b7e908b39f6b8764304dfec5e26"} Dec 03 12:28:47 crc kubenswrapper[4711]: I1203 12:28:47.188021 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kcwq6" event={"ID":"20c9cd89-1c58-47a2-941a-76091eda8baa","Type":"ContainerStarted","Data":"9e6b683e3f37d10c675eb06eb8e5d642d18e561bd70e805e13e0847102326e97"} Dec 03 12:28:47 crc kubenswrapper[4711]: I1203 12:28:47.195120 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tltll" event={"ID":"8b4e35ef-e95e-42b5-ae3d-f5f92bfec867","Type":"ContainerStarted","Data":"17badb3167512a83336c42dd62a1f5803c3afec40d76970c573d57d0313a370b"} Dec 03 12:28:47 crc kubenswrapper[4711]: I1203 12:28:47.229672 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kcwq6" podStartSLOduration=3.172931208 podStartE2EDuration="6.229641446s" podCreationTimestamp="2025-12-03 12:28:41 +0000 UTC" firstStartedPulling="2025-12-03 12:28:43.090538772 +0000 UTC m=+841.759790027" lastFinishedPulling="2025-12-03 12:28:46.14724901 +0000 UTC m=+844.816500265" observedRunningTime="2025-12-03 12:28:47.224920084 +0000 UTC m=+845.894171349" watchObservedRunningTime="2025-12-03 12:28:47.229641446 +0000 UTC m=+845.898892701" Dec 03 12:28:47 crc kubenswrapper[4711]: I1203 12:28:47.252015 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tltll" podStartSLOduration=2.067098959 podStartE2EDuration="5.251999399s" podCreationTimestamp="2025-12-03 12:28:42 +0000 UTC" firstStartedPulling="2025-12-03 12:28:43.108109505 +0000 UTC m=+841.777360760" lastFinishedPulling="2025-12-03 12:28:46.293009945 +0000 UTC m=+844.962261200" 
observedRunningTime="2025-12-03 12:28:47.248548673 +0000 UTC m=+845.917799938" watchObservedRunningTime="2025-12-03 12:28:47.251999399 +0000 UTC m=+845.921250654" Dec 03 12:28:48 crc kubenswrapper[4711]: I1203 12:28:48.201961 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7vzf" event={"ID":"357500e9-9abf-48f2-a476-ccecf371bfbe","Type":"ContainerStarted","Data":"f3397a1ed22170e5fe4f0bf1b6362f5b027f6474d7417263328658bc6de95a9a"} Dec 03 12:28:48 crc kubenswrapper[4711]: I1203 12:28:48.204431 4711 generic.go:334] "Generic (PLEG): container finished" podID="888df056-de6f-4c3c-80b3-3029afdcfe85" containerID="4fb2a1d074f7320ef6452cc54bcaf743f55a22b013d46797f59526fcf36d315d" exitCode=0 Dec 03 12:28:48 crc kubenswrapper[4711]: I1203 12:28:48.204492 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsnk6" event={"ID":"888df056-de6f-4c3c-80b3-3029afdcfe85","Type":"ContainerDied","Data":"4fb2a1d074f7320ef6452cc54bcaf743f55a22b013d46797f59526fcf36d315d"} Dec 03 12:28:48 crc kubenswrapper[4711]: I1203 12:28:48.230810 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q7vzf" podStartSLOduration=3.331341475 podStartE2EDuration="6.230792938s" podCreationTimestamp="2025-12-03 12:28:42 +0000 UTC" firstStartedPulling="2025-12-03 12:28:44.128043009 +0000 UTC m=+842.797294264" lastFinishedPulling="2025-12-03 12:28:47.027494472 +0000 UTC m=+845.696745727" observedRunningTime="2025-12-03 12:28:48.226512779 +0000 UTC m=+846.895764044" watchObservedRunningTime="2025-12-03 12:28:48.230792938 +0000 UTC m=+846.900044193" Dec 03 12:28:49 crc kubenswrapper[4711]: I1203 12:28:49.212144 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ccx4z" 
event={"ID":"ef97ed00-d8de-42fd-94ab-8ff1b1569291","Type":"ContainerStarted","Data":"94d6c478ac759335ec76160b1ca566ba6cf31f2741c4fb0d93fccbee6e6cbb9e"} Dec 03 12:28:49 crc kubenswrapper[4711]: I1203 12:28:49.214540 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsnk6" event={"ID":"888df056-de6f-4c3c-80b3-3029afdcfe85","Type":"ContainerStarted","Data":"0355f3564aea7e7357fcb1ff3f728c90532f0ef3cf0022c9c3ed45a9f7ef0454"} Dec 03 12:28:49 crc kubenswrapper[4711]: I1203 12:28:49.254894 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gsnk6" podStartSLOduration=2.406815119 podStartE2EDuration="5.25485681s" podCreationTimestamp="2025-12-03 12:28:44 +0000 UTC" firstStartedPulling="2025-12-03 12:28:46.162560061 +0000 UTC m=+844.831811316" lastFinishedPulling="2025-12-03 12:28:49.010601762 +0000 UTC m=+847.679853007" observedRunningTime="2025-12-03 12:28:49.253047739 +0000 UTC m=+847.922299014" watchObservedRunningTime="2025-12-03 12:28:49.25485681 +0000 UTC m=+847.924108075" Dec 03 12:28:49 crc kubenswrapper[4711]: I1203 12:28:49.618470 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-n6q2z" Dec 03 12:28:49 crc kubenswrapper[4711]: I1203 12:28:49.618527 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-n6q2z" Dec 03 12:28:49 crc kubenswrapper[4711]: I1203 12:28:49.661474 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-n6q2z" Dec 03 12:28:50 crc kubenswrapper[4711]: I1203 12:28:50.177019 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bm8qz" Dec 03 12:28:50 crc kubenswrapper[4711]: I1203 12:28:50.177373 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-bm8qz" Dec 03 12:28:50 crc kubenswrapper[4711]: I1203 12:28:50.219618 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bm8qz" Dec 03 12:28:50 crc kubenswrapper[4711]: I1203 12:28:50.225334 4711 generic.go:334] "Generic (PLEG): container finished" podID="ef97ed00-d8de-42fd-94ab-8ff1b1569291" containerID="94d6c478ac759335ec76160b1ca566ba6cf31f2741c4fb0d93fccbee6e6cbb9e" exitCode=0 Dec 03 12:28:50 crc kubenswrapper[4711]: I1203 12:28:50.225369 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ccx4z" event={"ID":"ef97ed00-d8de-42fd-94ab-8ff1b1569291","Type":"ContainerDied","Data":"94d6c478ac759335ec76160b1ca566ba6cf31f2741c4fb0d93fccbee6e6cbb9e"} Dec 03 12:28:50 crc kubenswrapper[4711]: I1203 12:28:50.276369 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bm8qz" Dec 03 12:28:50 crc kubenswrapper[4711]: I1203 12:28:50.285269 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-n6q2z" Dec 03 12:28:51 crc kubenswrapper[4711]: I1203 12:28:51.936461 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wrqcm" Dec 03 12:28:51 crc kubenswrapper[4711]: I1203 12:28:51.936848 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wrqcm" Dec 03 12:28:51 crc kubenswrapper[4711]: I1203 12:28:51.999300 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wrqcm" Dec 03 12:28:52 crc kubenswrapper[4711]: I1203 12:28:52.286053 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wrqcm" Dec 03 12:28:52 crc 
kubenswrapper[4711]: I1203 12:28:52.347397 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kcwq6" Dec 03 12:28:52 crc kubenswrapper[4711]: I1203 12:28:52.347447 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kcwq6" Dec 03 12:28:52 crc kubenswrapper[4711]: I1203 12:28:52.393800 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kcwq6" Dec 03 12:28:52 crc kubenswrapper[4711]: I1203 12:28:52.542916 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tltll" Dec 03 12:28:52 crc kubenswrapper[4711]: I1203 12:28:52.543286 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tltll" Dec 03 12:28:52 crc kubenswrapper[4711]: I1203 12:28:52.586597 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tltll" Dec 03 12:28:52 crc kubenswrapper[4711]: I1203 12:28:52.796056 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n6q2z"] Dec 03 12:28:52 crc kubenswrapper[4711]: I1203 12:28:52.796417 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-n6q2z" podUID="9638f6ab-2a76-4c44-9100-237e91ad4f6f" containerName="registry-server" containerID="cri-o://521f9b1964c5a63c8c464ee6e4fa6196f3314338e3714454cc1d140e90aef4a6" gracePeriod=2 Dec 03 12:28:52 crc kubenswrapper[4711]: I1203 12:28:52.936991 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q7vzf" Dec 03 12:28:52 crc kubenswrapper[4711]: I1203 12:28:52.937056 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-q7vzf" Dec 03 12:28:52 crc kubenswrapper[4711]: I1203 12:28:52.988552 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q7vzf" Dec 03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.176436 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n6q2z" Dec 03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.243572 4711 generic.go:334] "Generic (PLEG): container finished" podID="9638f6ab-2a76-4c44-9100-237e91ad4f6f" containerID="521f9b1964c5a63c8c464ee6e4fa6196f3314338e3714454cc1d140e90aef4a6" exitCode=0 Dec 03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.243627 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n6q2z" event={"ID":"9638f6ab-2a76-4c44-9100-237e91ad4f6f","Type":"ContainerDied","Data":"521f9b1964c5a63c8c464ee6e4fa6196f3314338e3714454cc1d140e90aef4a6"} Dec 03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.243647 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n6q2z" Dec 03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.243678 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n6q2z" event={"ID":"9638f6ab-2a76-4c44-9100-237e91ad4f6f","Type":"ContainerDied","Data":"ff1ed6e1014ddb28943cb2d494d94dbf141d7dbeda3857ae6dec823840339402"} Dec 03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.243698 4711 scope.go:117] "RemoveContainer" containerID="521f9b1964c5a63c8c464ee6e4fa6196f3314338e3714454cc1d140e90aef4a6" Dec 03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.246012 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ccx4z" event={"ID":"ef97ed00-d8de-42fd-94ab-8ff1b1569291","Type":"ContainerStarted","Data":"7350897a0914dae93d18e10b0d00ae2ee93d50c3b978fbdef7629bd62493374a"} Dec 03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.260259 4711 scope.go:117] "RemoveContainer" containerID="0b64c1ff2b94cbf9fec780896ee2fa70b9613bb8b9a11320f2904a6c322edd40" Dec 03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.264210 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ccx4z" podStartSLOduration=3.75545523 podStartE2EDuration="9.264193232s" podCreationTimestamp="2025-12-03 12:28:44 +0000 UTC" firstStartedPulling="2025-12-03 12:28:47.185983788 +0000 UTC m=+845.855235053" lastFinishedPulling="2025-12-03 12:28:52.6947218 +0000 UTC m=+851.363973055" observedRunningTime="2025-12-03 12:28:53.264164531 +0000 UTC m=+851.933415816" watchObservedRunningTime="2025-12-03 12:28:53.264193232 +0000 UTC m=+851.933444487" Dec 03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.293703 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4g44\" (UniqueName: \"kubernetes.io/projected/9638f6ab-2a76-4c44-9100-237e91ad4f6f-kube-api-access-h4g44\") pod 
\"9638f6ab-2a76-4c44-9100-237e91ad4f6f\" (UID: \"9638f6ab-2a76-4c44-9100-237e91ad4f6f\") " Dec 03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.293783 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9638f6ab-2a76-4c44-9100-237e91ad4f6f-catalog-content\") pod \"9638f6ab-2a76-4c44-9100-237e91ad4f6f\" (UID: \"9638f6ab-2a76-4c44-9100-237e91ad4f6f\") " Dec 03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.293809 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9638f6ab-2a76-4c44-9100-237e91ad4f6f-utilities\") pod \"9638f6ab-2a76-4c44-9100-237e91ad4f6f\" (UID: \"9638f6ab-2a76-4c44-9100-237e91ad4f6f\") " Dec 03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.294578 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9638f6ab-2a76-4c44-9100-237e91ad4f6f-utilities" (OuterVolumeSpecName: "utilities") pod "9638f6ab-2a76-4c44-9100-237e91ad4f6f" (UID: "9638f6ab-2a76-4c44-9100-237e91ad4f6f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.295264 4711 scope.go:117] "RemoveContainer" containerID="879eaec703b77703d7074172cb633350f368f6ca2d2e1675c73a06802f0d7055" Dec 03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.297491 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tltll" Dec 03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.298895 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9638f6ab-2a76-4c44-9100-237e91ad4f6f-kube-api-access-h4g44" (OuterVolumeSpecName: "kube-api-access-h4g44") pod "9638f6ab-2a76-4c44-9100-237e91ad4f6f" (UID: "9638f6ab-2a76-4c44-9100-237e91ad4f6f"). InnerVolumeSpecName "kube-api-access-h4g44". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.299718 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q7vzf" Dec 03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.306639 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kcwq6" Dec 03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.319675 4711 scope.go:117] "RemoveContainer" containerID="521f9b1964c5a63c8c464ee6e4fa6196f3314338e3714454cc1d140e90aef4a6" Dec 03 12:28:53 crc kubenswrapper[4711]: E1203 12:28:53.333529 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"521f9b1964c5a63c8c464ee6e4fa6196f3314338e3714454cc1d140e90aef4a6\": container with ID starting with 521f9b1964c5a63c8c464ee6e4fa6196f3314338e3714454cc1d140e90aef4a6 not found: ID does not exist" containerID="521f9b1964c5a63c8c464ee6e4fa6196f3314338e3714454cc1d140e90aef4a6" Dec 03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.333596 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"521f9b1964c5a63c8c464ee6e4fa6196f3314338e3714454cc1d140e90aef4a6"} err="failed to get container status \"521f9b1964c5a63c8c464ee6e4fa6196f3314338e3714454cc1d140e90aef4a6\": rpc error: code = NotFound desc = could not find container \"521f9b1964c5a63c8c464ee6e4fa6196f3314338e3714454cc1d140e90aef4a6\": container with ID starting with 521f9b1964c5a63c8c464ee6e4fa6196f3314338e3714454cc1d140e90aef4a6 not found: ID does not exist" Dec 03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.333635 4711 scope.go:117] "RemoveContainer" containerID="0b64c1ff2b94cbf9fec780896ee2fa70b9613bb8b9a11320f2904a6c322edd40" Dec 03 12:28:53 crc kubenswrapper[4711]: E1203 12:28:53.334387 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"0b64c1ff2b94cbf9fec780896ee2fa70b9613bb8b9a11320f2904a6c322edd40\": container with ID starting with 0b64c1ff2b94cbf9fec780896ee2fa70b9613bb8b9a11320f2904a6c322edd40 not found: ID does not exist" containerID="0b64c1ff2b94cbf9fec780896ee2fa70b9613bb8b9a11320f2904a6c322edd40" Dec 03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.334448 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b64c1ff2b94cbf9fec780896ee2fa70b9613bb8b9a11320f2904a6c322edd40"} err="failed to get container status \"0b64c1ff2b94cbf9fec780896ee2fa70b9613bb8b9a11320f2904a6c322edd40\": rpc error: code = NotFound desc = could not find container \"0b64c1ff2b94cbf9fec780896ee2fa70b9613bb8b9a11320f2904a6c322edd40\": container with ID starting with 0b64c1ff2b94cbf9fec780896ee2fa70b9613bb8b9a11320f2904a6c322edd40 not found: ID does not exist" Dec 03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.334492 4711 scope.go:117] "RemoveContainer" containerID="879eaec703b77703d7074172cb633350f368f6ca2d2e1675c73a06802f0d7055" Dec 03 12:28:53 crc kubenswrapper[4711]: E1203 12:28:53.335171 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"879eaec703b77703d7074172cb633350f368f6ca2d2e1675c73a06802f0d7055\": container with ID starting with 879eaec703b77703d7074172cb633350f368f6ca2d2e1675c73a06802f0d7055 not found: ID does not exist" containerID="879eaec703b77703d7074172cb633350f368f6ca2d2e1675c73a06802f0d7055" Dec 03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.335219 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"879eaec703b77703d7074172cb633350f368f6ca2d2e1675c73a06802f0d7055"} err="failed to get container status \"879eaec703b77703d7074172cb633350f368f6ca2d2e1675c73a06802f0d7055\": rpc error: code = NotFound desc = could not find container 
\"879eaec703b77703d7074172cb633350f368f6ca2d2e1675c73a06802f0d7055\": container with ID starting with 879eaec703b77703d7074172cb633350f368f6ca2d2e1675c73a06802f0d7055 not found: ID does not exist" Dec 03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.356609 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9638f6ab-2a76-4c44-9100-237e91ad4f6f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9638f6ab-2a76-4c44-9100-237e91ad4f6f" (UID: "9638f6ab-2a76-4c44-9100-237e91ad4f6f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.399425 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bm8qz"] Dec 03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.399749 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bm8qz" podUID="bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff" containerName="registry-server" containerID="cri-o://6919cd3853a00e24126a78258530be384a92bb3d4803412d598d77093d2e55be" gracePeriod=2 Dec 03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.399781 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4g44\" (UniqueName: \"kubernetes.io/projected/9638f6ab-2a76-4c44-9100-237e91ad4f6f-kube-api-access-h4g44\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.399822 4711 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9638f6ab-2a76-4c44-9100-237e91ad4f6f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.399837 4711 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9638f6ab-2a76-4c44-9100-237e91ad4f6f-utilities\") on node \"crc\" DevicePath \"\"" Dec 
03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.587532 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n6q2z"] Dec 03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.593796 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-n6q2z"] Dec 03 12:28:53 crc kubenswrapper[4711]: I1203 12:28:53.824399 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9638f6ab-2a76-4c44-9100-237e91ad4f6f" path="/var/lib/kubelet/pods/9638f6ab-2a76-4c44-9100-237e91ad4f6f/volumes" Dec 03 12:28:54 crc kubenswrapper[4711]: I1203 12:28:54.257778 4711 generic.go:334] "Generic (PLEG): container finished" podID="bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff" containerID="6919cd3853a00e24126a78258530be384a92bb3d4803412d598d77093d2e55be" exitCode=0 Dec 03 12:28:54 crc kubenswrapper[4711]: I1203 12:28:54.257935 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bm8qz" event={"ID":"bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff","Type":"ContainerDied","Data":"6919cd3853a00e24126a78258530be384a92bb3d4803412d598d77093d2e55be"} Dec 03 12:28:54 crc kubenswrapper[4711]: I1203 12:28:54.257964 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bm8qz" event={"ID":"bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff","Type":"ContainerDied","Data":"2bf6bd7b72f70e420d643e08f29de22bbaf2d26423cbf0417529a7343a15f453"} Dec 03 12:28:54 crc kubenswrapper[4711]: I1203 12:28:54.257986 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bf6bd7b72f70e420d643e08f29de22bbaf2d26423cbf0417529a7343a15f453" Dec 03 12:28:54 crc kubenswrapper[4711]: I1203 12:28:54.272463 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bm8qz" Dec 03 12:28:54 crc kubenswrapper[4711]: I1203 12:28:54.414043 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnmvs\" (UniqueName: \"kubernetes.io/projected/bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff-kube-api-access-dnmvs\") pod \"bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff\" (UID: \"bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff\") " Dec 03 12:28:54 crc kubenswrapper[4711]: I1203 12:28:54.414113 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff-catalog-content\") pod \"bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff\" (UID: \"bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff\") " Dec 03 12:28:54 crc kubenswrapper[4711]: I1203 12:28:54.414175 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff-utilities\") pod \"bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff\" (UID: \"bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff\") " Dec 03 12:28:54 crc kubenswrapper[4711]: I1203 12:28:54.415208 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff-utilities" (OuterVolumeSpecName: "utilities") pod "bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff" (UID: "bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:28:54 crc kubenswrapper[4711]: I1203 12:28:54.422188 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff-kube-api-access-dnmvs" (OuterVolumeSpecName: "kube-api-access-dnmvs") pod "bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff" (UID: "bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff"). InnerVolumeSpecName "kube-api-access-dnmvs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:28:54 crc kubenswrapper[4711]: I1203 12:28:54.471245 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff" (UID: "bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:28:54 crc kubenswrapper[4711]: I1203 12:28:54.516091 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnmvs\" (UniqueName: \"kubernetes.io/projected/bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff-kube-api-access-dnmvs\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:54 crc kubenswrapper[4711]: I1203 12:28:54.516130 4711 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:54 crc kubenswrapper[4711]: I1203 12:28:54.516144 4711 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:54 crc kubenswrapper[4711]: I1203 12:28:54.723449 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gsnk6" Dec 03 12:28:54 crc kubenswrapper[4711]: I1203 12:28:54.723501 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gsnk6" Dec 03 12:28:54 crc kubenswrapper[4711]: I1203 12:28:54.779398 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gsnk6" Dec 03 12:28:55 crc kubenswrapper[4711]: I1203 12:28:55.191852 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-kcwq6"] Dec 03 12:28:55 crc kubenswrapper[4711]: I1203 12:28:55.262980 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bm8qz" Dec 03 12:28:55 crc kubenswrapper[4711]: I1203 12:28:55.263110 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kcwq6" podUID="20c9cd89-1c58-47a2-941a-76091eda8baa" containerName="registry-server" containerID="cri-o://9e6b683e3f37d10c675eb06eb8e5d642d18e561bd70e805e13e0847102326e97" gracePeriod=2 Dec 03 12:28:55 crc kubenswrapper[4711]: I1203 12:28:55.295081 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bm8qz"] Dec 03 12:28:55 crc kubenswrapper[4711]: I1203 12:28:55.299697 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bm8qz"] Dec 03 12:28:55 crc kubenswrapper[4711]: I1203 12:28:55.309468 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gsnk6" Dec 03 12:28:55 crc kubenswrapper[4711]: I1203 12:28:55.375096 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ccx4z" Dec 03 12:28:55 crc kubenswrapper[4711]: I1203 12:28:55.375152 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ccx4z" Dec 03 12:28:55 crc kubenswrapper[4711]: I1203 12:28:55.412881 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ccx4z" Dec 03 12:28:55 crc kubenswrapper[4711]: I1203 12:28:55.794964 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q7vzf"] Dec 03 12:28:55 crc kubenswrapper[4711]: I1203 12:28:55.795319 4711 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/redhat-operators-q7vzf" podUID="357500e9-9abf-48f2-a476-ccecf371bfbe" containerName="registry-server" containerID="cri-o://f3397a1ed22170e5fe4f0bf1b6362f5b027f6474d7417263328658bc6de95a9a" gracePeriod=2 Dec 03 12:28:55 crc kubenswrapper[4711]: I1203 12:28:55.825868 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff" path="/var/lib/kubelet/pods/bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff/volumes" Dec 03 12:28:56 crc kubenswrapper[4711]: I1203 12:28:56.274194 4711 generic.go:334] "Generic (PLEG): container finished" podID="20c9cd89-1c58-47a2-941a-76091eda8baa" containerID="9e6b683e3f37d10c675eb06eb8e5d642d18e561bd70e805e13e0847102326e97" exitCode=0 Dec 03 12:28:56 crc kubenswrapper[4711]: I1203 12:28:56.274271 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kcwq6" event={"ID":"20c9cd89-1c58-47a2-941a-76091eda8baa","Type":"ContainerDied","Data":"9e6b683e3f37d10c675eb06eb8e5d642d18e561bd70e805e13e0847102326e97"} Dec 03 12:28:57 crc kubenswrapper[4711]: I1203 12:28:57.284262 4711 generic.go:334] "Generic (PLEG): container finished" podID="357500e9-9abf-48f2-a476-ccecf371bfbe" containerID="f3397a1ed22170e5fe4f0bf1b6362f5b027f6474d7417263328658bc6de95a9a" exitCode=0 Dec 03 12:28:57 crc kubenswrapper[4711]: I1203 12:28:57.284328 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7vzf" event={"ID":"357500e9-9abf-48f2-a476-ccecf371bfbe","Type":"ContainerDied","Data":"f3397a1ed22170e5fe4f0bf1b6362f5b027f6474d7417263328658bc6de95a9a"} Dec 03 12:28:57 crc kubenswrapper[4711]: I1203 12:28:57.478283 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kcwq6" Dec 03 12:28:57 crc kubenswrapper[4711]: I1203 12:28:57.654371 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20c9cd89-1c58-47a2-941a-76091eda8baa-utilities\") pod \"20c9cd89-1c58-47a2-941a-76091eda8baa\" (UID: \"20c9cd89-1c58-47a2-941a-76091eda8baa\") " Dec 03 12:28:57 crc kubenswrapper[4711]: I1203 12:28:57.654560 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20c9cd89-1c58-47a2-941a-76091eda8baa-catalog-content\") pod \"20c9cd89-1c58-47a2-941a-76091eda8baa\" (UID: \"20c9cd89-1c58-47a2-941a-76091eda8baa\") " Dec 03 12:28:57 crc kubenswrapper[4711]: I1203 12:28:57.654640 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5scc\" (UniqueName: \"kubernetes.io/projected/20c9cd89-1c58-47a2-941a-76091eda8baa-kube-api-access-w5scc\") pod \"20c9cd89-1c58-47a2-941a-76091eda8baa\" (UID: \"20c9cd89-1c58-47a2-941a-76091eda8baa\") " Dec 03 12:28:57 crc kubenswrapper[4711]: I1203 12:28:57.655455 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20c9cd89-1c58-47a2-941a-76091eda8baa-utilities" (OuterVolumeSpecName: "utilities") pod "20c9cd89-1c58-47a2-941a-76091eda8baa" (UID: "20c9cd89-1c58-47a2-941a-76091eda8baa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:28:57 crc kubenswrapper[4711]: I1203 12:28:57.663330 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20c9cd89-1c58-47a2-941a-76091eda8baa-kube-api-access-w5scc" (OuterVolumeSpecName: "kube-api-access-w5scc") pod "20c9cd89-1c58-47a2-941a-76091eda8baa" (UID: "20c9cd89-1c58-47a2-941a-76091eda8baa"). InnerVolumeSpecName "kube-api-access-w5scc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:28:57 crc kubenswrapper[4711]: I1203 12:28:57.684152 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20c9cd89-1c58-47a2-941a-76091eda8baa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "20c9cd89-1c58-47a2-941a-76091eda8baa" (UID: "20c9cd89-1c58-47a2-941a-76091eda8baa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:28:57 crc kubenswrapper[4711]: I1203 12:28:57.756010 4711 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20c9cd89-1c58-47a2-941a-76091eda8baa-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:57 crc kubenswrapper[4711]: I1203 12:28:57.756059 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5scc\" (UniqueName: \"kubernetes.io/projected/20c9cd89-1c58-47a2-941a-76091eda8baa-kube-api-access-w5scc\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:57 crc kubenswrapper[4711]: I1203 12:28:57.756074 4711 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20c9cd89-1c58-47a2-941a-76091eda8baa-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:57 crc kubenswrapper[4711]: I1203 12:28:57.973655 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q7vzf" Dec 03 12:28:58 crc kubenswrapper[4711]: I1203 12:28:58.160856 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/357500e9-9abf-48f2-a476-ccecf371bfbe-catalog-content\") pod \"357500e9-9abf-48f2-a476-ccecf371bfbe\" (UID: \"357500e9-9abf-48f2-a476-ccecf371bfbe\") " Dec 03 12:28:58 crc kubenswrapper[4711]: I1203 12:28:58.160939 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/357500e9-9abf-48f2-a476-ccecf371bfbe-utilities\") pod \"357500e9-9abf-48f2-a476-ccecf371bfbe\" (UID: \"357500e9-9abf-48f2-a476-ccecf371bfbe\") " Dec 03 12:28:58 crc kubenswrapper[4711]: I1203 12:28:58.161014 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dd4zc\" (UniqueName: \"kubernetes.io/projected/357500e9-9abf-48f2-a476-ccecf371bfbe-kube-api-access-dd4zc\") pod \"357500e9-9abf-48f2-a476-ccecf371bfbe\" (UID: \"357500e9-9abf-48f2-a476-ccecf371bfbe\") " Dec 03 12:28:58 crc kubenswrapper[4711]: I1203 12:28:58.162226 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/357500e9-9abf-48f2-a476-ccecf371bfbe-utilities" (OuterVolumeSpecName: "utilities") pod "357500e9-9abf-48f2-a476-ccecf371bfbe" (UID: "357500e9-9abf-48f2-a476-ccecf371bfbe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:28:58 crc kubenswrapper[4711]: I1203 12:28:58.165173 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/357500e9-9abf-48f2-a476-ccecf371bfbe-kube-api-access-dd4zc" (OuterVolumeSpecName: "kube-api-access-dd4zc") pod "357500e9-9abf-48f2-a476-ccecf371bfbe" (UID: "357500e9-9abf-48f2-a476-ccecf371bfbe"). InnerVolumeSpecName "kube-api-access-dd4zc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:28:58 crc kubenswrapper[4711]: I1203 12:28:58.262241 4711 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/357500e9-9abf-48f2-a476-ccecf371bfbe-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:58 crc kubenswrapper[4711]: I1203 12:28:58.262285 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dd4zc\" (UniqueName: \"kubernetes.io/projected/357500e9-9abf-48f2-a476-ccecf371bfbe-kube-api-access-dd4zc\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:58 crc kubenswrapper[4711]: I1203 12:28:58.292822 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7vzf" event={"ID":"357500e9-9abf-48f2-a476-ccecf371bfbe","Type":"ContainerDied","Data":"e97dce353c939ecc22565d2d062b620c79efa20bcf72167480b60e4dd2a151ee"} Dec 03 12:28:58 crc kubenswrapper[4711]: I1203 12:28:58.292845 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q7vzf" Dec 03 12:28:58 crc kubenswrapper[4711]: I1203 12:28:58.292877 4711 scope.go:117] "RemoveContainer" containerID="f3397a1ed22170e5fe4f0bf1b6362f5b027f6474d7417263328658bc6de95a9a" Dec 03 12:28:58 crc kubenswrapper[4711]: I1203 12:28:58.296336 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/357500e9-9abf-48f2-a476-ccecf371bfbe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "357500e9-9abf-48f2-a476-ccecf371bfbe" (UID: "357500e9-9abf-48f2-a476-ccecf371bfbe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:28:58 crc kubenswrapper[4711]: I1203 12:28:58.299064 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kcwq6" event={"ID":"20c9cd89-1c58-47a2-941a-76091eda8baa","Type":"ContainerDied","Data":"a8bd199c720c7fc371659e96b2b66cadccd7c88d5d03c886fe3e3e776a7c26f7"} Dec 03 12:28:58 crc kubenswrapper[4711]: I1203 12:28:58.299164 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kcwq6" Dec 03 12:28:58 crc kubenswrapper[4711]: I1203 12:28:58.314474 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kcwq6"] Dec 03 12:28:58 crc kubenswrapper[4711]: I1203 12:28:58.318156 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kcwq6"] Dec 03 12:28:58 crc kubenswrapper[4711]: I1203 12:28:58.320449 4711 scope.go:117] "RemoveContainer" containerID="a5ca73b2ada72e197c4f03e8bc2b913b3989db8f8b9e4a49af5684524927c93f" Dec 03 12:28:58 crc kubenswrapper[4711]: I1203 12:28:58.333633 4711 scope.go:117] "RemoveContainer" containerID="d2d91c5f3f30421a3d8c040c3f6c7da136025984d3f9ed1cfc995fb4279d41b0" Dec 03 12:28:58 crc kubenswrapper[4711]: I1203 12:28:58.345601 4711 scope.go:117] "RemoveContainer" containerID="9e6b683e3f37d10c675eb06eb8e5d642d18e561bd70e805e13e0847102326e97" Dec 03 12:28:58 crc kubenswrapper[4711]: I1203 12:28:58.360747 4711 scope.go:117] "RemoveContainer" containerID="36ecb34f603231eeef40c4a770fd131cf81f4c0e02e065d29459416ff438c52d" Dec 03 12:28:58 crc kubenswrapper[4711]: I1203 12:28:58.363453 4711 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/357500e9-9abf-48f2-a476-ccecf371bfbe-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:58 crc kubenswrapper[4711]: I1203 12:28:58.375108 4711 scope.go:117] 
"RemoveContainer" containerID="3a48546805a8f45bcf1ed08cf0c58a95dfdfdec94782808f3a0312479f13c214" Dec 03 12:28:58 crc kubenswrapper[4711]: I1203 12:28:58.634812 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q7vzf"] Dec 03 12:28:58 crc kubenswrapper[4711]: I1203 12:28:58.643183 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-q7vzf"] Dec 03 12:28:59 crc kubenswrapper[4711]: I1203 12:28:59.825196 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20c9cd89-1c58-47a2-941a-76091eda8baa" path="/var/lib/kubelet/pods/20c9cd89-1c58-47a2-941a-76091eda8baa/volumes" Dec 03 12:28:59 crc kubenswrapper[4711]: I1203 12:28:59.826407 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="357500e9-9abf-48f2-a476-ccecf371bfbe" path="/var/lib/kubelet/pods/357500e9-9abf-48f2-a476-ccecf371bfbe/volumes" Dec 03 12:29:05 crc kubenswrapper[4711]: I1203 12:29:05.426151 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ccx4z" Dec 03 12:30:00 crc kubenswrapper[4711]: I1203 12:30:00.182145 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412750-2nxz9"] Dec 03 12:30:00 crc kubenswrapper[4711]: E1203 12:30:00.182926 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9638f6ab-2a76-4c44-9100-237e91ad4f6f" containerName="extract-content" Dec 03 12:30:00 crc kubenswrapper[4711]: I1203 12:30:00.182944 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="9638f6ab-2a76-4c44-9100-237e91ad4f6f" containerName="extract-content" Dec 03 12:30:00 crc kubenswrapper[4711]: E1203 12:30:00.182961 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff" containerName="extract-utilities" Dec 03 12:30:00 crc kubenswrapper[4711]: I1203 12:30:00.182970 4711 
state_mem.go:107] "Deleted CPUSet assignment" podUID="bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff" containerName="extract-utilities" Dec 03 12:30:00 crc kubenswrapper[4711]: E1203 12:30:00.182983 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff" containerName="extract-content" Dec 03 12:30:00 crc kubenswrapper[4711]: I1203 12:30:00.182991 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff" containerName="extract-content" Dec 03 12:30:00 crc kubenswrapper[4711]: E1203 12:30:00.183003 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="357500e9-9abf-48f2-a476-ccecf371bfbe" containerName="extract-content" Dec 03 12:30:00 crc kubenswrapper[4711]: I1203 12:30:00.183012 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="357500e9-9abf-48f2-a476-ccecf371bfbe" containerName="extract-content" Dec 03 12:30:00 crc kubenswrapper[4711]: E1203 12:30:00.183023 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff" containerName="registry-server" Dec 03 12:30:00 crc kubenswrapper[4711]: I1203 12:30:00.183031 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff" containerName="registry-server" Dec 03 12:30:00 crc kubenswrapper[4711]: E1203 12:30:00.183048 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20c9cd89-1c58-47a2-941a-76091eda8baa" containerName="extract-content" Dec 03 12:30:00 crc kubenswrapper[4711]: I1203 12:30:00.183057 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="20c9cd89-1c58-47a2-941a-76091eda8baa" containerName="extract-content" Dec 03 12:30:00 crc kubenswrapper[4711]: E1203 12:30:00.183074 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9638f6ab-2a76-4c44-9100-237e91ad4f6f" containerName="extract-utilities" Dec 03 12:30:00 crc kubenswrapper[4711]: I1203 12:30:00.183082 4711 
state_mem.go:107] "Deleted CPUSet assignment" podUID="9638f6ab-2a76-4c44-9100-237e91ad4f6f" containerName="extract-utilities" Dec 03 12:30:00 crc kubenswrapper[4711]: E1203 12:30:00.183092 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="357500e9-9abf-48f2-a476-ccecf371bfbe" containerName="extract-utilities" Dec 03 12:30:00 crc kubenswrapper[4711]: I1203 12:30:00.183100 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="357500e9-9abf-48f2-a476-ccecf371bfbe" containerName="extract-utilities" Dec 03 12:30:00 crc kubenswrapper[4711]: E1203 12:30:00.183116 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="357500e9-9abf-48f2-a476-ccecf371bfbe" containerName="registry-server" Dec 03 12:30:00 crc kubenswrapper[4711]: I1203 12:30:00.183126 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="357500e9-9abf-48f2-a476-ccecf371bfbe" containerName="registry-server" Dec 03 12:30:00 crc kubenswrapper[4711]: E1203 12:30:00.183142 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20c9cd89-1c58-47a2-941a-76091eda8baa" containerName="extract-utilities" Dec 03 12:30:00 crc kubenswrapper[4711]: I1203 12:30:00.183152 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="20c9cd89-1c58-47a2-941a-76091eda8baa" containerName="extract-utilities" Dec 03 12:30:00 crc kubenswrapper[4711]: E1203 12:30:00.183167 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20c9cd89-1c58-47a2-941a-76091eda8baa" containerName="registry-server" Dec 03 12:30:00 crc kubenswrapper[4711]: I1203 12:30:00.183177 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="20c9cd89-1c58-47a2-941a-76091eda8baa" containerName="registry-server" Dec 03 12:30:00 crc kubenswrapper[4711]: E1203 12:30:00.183193 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9638f6ab-2a76-4c44-9100-237e91ad4f6f" containerName="registry-server" Dec 03 12:30:00 crc kubenswrapper[4711]: I1203 12:30:00.183204 4711 
state_mem.go:107] "Deleted CPUSet assignment" podUID="9638f6ab-2a76-4c44-9100-237e91ad4f6f" containerName="registry-server" Dec 03 12:30:00 crc kubenswrapper[4711]: I1203 12:30:00.183345 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="9638f6ab-2a76-4c44-9100-237e91ad4f6f" containerName="registry-server" Dec 03 12:30:00 crc kubenswrapper[4711]: I1203 12:30:00.183363 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="357500e9-9abf-48f2-a476-ccecf371bfbe" containerName="registry-server" Dec 03 12:30:00 crc kubenswrapper[4711]: I1203 12:30:00.183374 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="20c9cd89-1c58-47a2-941a-76091eda8baa" containerName="registry-server" Dec 03 12:30:00 crc kubenswrapper[4711]: I1203 12:30:00.183389 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcb8fdb7-5812-4474-b2dc-31f00bbbd1ff" containerName="registry-server" Dec 03 12:30:00 crc kubenswrapper[4711]: I1203 12:30:00.183989 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-2nxz9" Dec 03 12:30:00 crc kubenswrapper[4711]: I1203 12:30:00.187998 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 12:30:00 crc kubenswrapper[4711]: I1203 12:30:00.189025 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 12:30:00 crc kubenswrapper[4711]: I1203 12:30:00.190101 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412750-2nxz9"] Dec 03 12:30:00 crc kubenswrapper[4711]: I1203 12:30:00.278115 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp7pb\" (UniqueName: \"kubernetes.io/projected/17b60f93-36af-41dc-98bd-2b497c664296-kube-api-access-dp7pb\") pod \"collect-profiles-29412750-2nxz9\" (UID: \"17b60f93-36af-41dc-98bd-2b497c664296\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-2nxz9" Dec 03 12:30:00 crc kubenswrapper[4711]: I1203 12:30:00.278437 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/17b60f93-36af-41dc-98bd-2b497c664296-secret-volume\") pod \"collect-profiles-29412750-2nxz9\" (UID: \"17b60f93-36af-41dc-98bd-2b497c664296\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-2nxz9" Dec 03 12:30:00 crc kubenswrapper[4711]: I1203 12:30:00.278536 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/17b60f93-36af-41dc-98bd-2b497c664296-config-volume\") pod \"collect-profiles-29412750-2nxz9\" (UID: \"17b60f93-36af-41dc-98bd-2b497c664296\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-2nxz9" Dec 03 12:30:00 crc kubenswrapper[4711]: I1203 12:30:00.379988 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp7pb\" (UniqueName: \"kubernetes.io/projected/17b60f93-36af-41dc-98bd-2b497c664296-kube-api-access-dp7pb\") pod \"collect-profiles-29412750-2nxz9\" (UID: \"17b60f93-36af-41dc-98bd-2b497c664296\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-2nxz9" Dec 03 12:30:00 crc kubenswrapper[4711]: I1203 12:30:00.380132 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/17b60f93-36af-41dc-98bd-2b497c664296-secret-volume\") pod \"collect-profiles-29412750-2nxz9\" (UID: \"17b60f93-36af-41dc-98bd-2b497c664296\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-2nxz9" Dec 03 12:30:00 crc kubenswrapper[4711]: I1203 12:30:00.380172 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/17b60f93-36af-41dc-98bd-2b497c664296-config-volume\") pod \"collect-profiles-29412750-2nxz9\" (UID: \"17b60f93-36af-41dc-98bd-2b497c664296\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-2nxz9" Dec 03 12:30:00 crc kubenswrapper[4711]: I1203 12:30:00.381271 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/17b60f93-36af-41dc-98bd-2b497c664296-config-volume\") pod \"collect-profiles-29412750-2nxz9\" (UID: \"17b60f93-36af-41dc-98bd-2b497c664296\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-2nxz9" Dec 03 12:30:00 crc kubenswrapper[4711]: I1203 12:30:00.389444 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/17b60f93-36af-41dc-98bd-2b497c664296-secret-volume\") pod \"collect-profiles-29412750-2nxz9\" (UID: \"17b60f93-36af-41dc-98bd-2b497c664296\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-2nxz9" Dec 03 12:30:00 crc kubenswrapper[4711]: I1203 12:30:00.398091 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp7pb\" (UniqueName: \"kubernetes.io/projected/17b60f93-36af-41dc-98bd-2b497c664296-kube-api-access-dp7pb\") pod \"collect-profiles-29412750-2nxz9\" (UID: \"17b60f93-36af-41dc-98bd-2b497c664296\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-2nxz9" Dec 03 12:30:00 crc kubenswrapper[4711]: I1203 12:30:00.502637 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-2nxz9" Dec 03 12:30:00 crc kubenswrapper[4711]: I1203 12:30:00.690972 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412750-2nxz9"] Dec 03 12:30:01 crc kubenswrapper[4711]: I1203 12:30:01.698042 4711 generic.go:334] "Generic (PLEG): container finished" podID="17b60f93-36af-41dc-98bd-2b497c664296" containerID="f8d095fd2565e0bb195a0dca7acbbf609a0042405a1634a686469c2a5d3a0910" exitCode=0 Dec 03 12:30:01 crc kubenswrapper[4711]: I1203 12:30:01.698143 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-2nxz9" event={"ID":"17b60f93-36af-41dc-98bd-2b497c664296","Type":"ContainerDied","Data":"f8d095fd2565e0bb195a0dca7acbbf609a0042405a1634a686469c2a5d3a0910"} Dec 03 12:30:01 crc kubenswrapper[4711]: I1203 12:30:01.698335 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-2nxz9" 
event={"ID":"17b60f93-36af-41dc-98bd-2b497c664296","Type":"ContainerStarted","Data":"7f74ccd913939c70f673d458744b2d450b813e88968e85c0f57a1117f2746dc2"} Dec 03 12:30:02 crc kubenswrapper[4711]: I1203 12:30:02.935164 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-2nxz9" Dec 03 12:30:03 crc kubenswrapper[4711]: I1203 12:30:03.010504 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dp7pb\" (UniqueName: \"kubernetes.io/projected/17b60f93-36af-41dc-98bd-2b497c664296-kube-api-access-dp7pb\") pod \"17b60f93-36af-41dc-98bd-2b497c664296\" (UID: \"17b60f93-36af-41dc-98bd-2b497c664296\") " Dec 03 12:30:03 crc kubenswrapper[4711]: I1203 12:30:03.010650 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/17b60f93-36af-41dc-98bd-2b497c664296-config-volume\") pod \"17b60f93-36af-41dc-98bd-2b497c664296\" (UID: \"17b60f93-36af-41dc-98bd-2b497c664296\") " Dec 03 12:30:03 crc kubenswrapper[4711]: I1203 12:30:03.010691 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/17b60f93-36af-41dc-98bd-2b497c664296-secret-volume\") pod \"17b60f93-36af-41dc-98bd-2b497c664296\" (UID: \"17b60f93-36af-41dc-98bd-2b497c664296\") " Dec 03 12:30:03 crc kubenswrapper[4711]: I1203 12:30:03.011471 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17b60f93-36af-41dc-98bd-2b497c664296-config-volume" (OuterVolumeSpecName: "config-volume") pod "17b60f93-36af-41dc-98bd-2b497c664296" (UID: "17b60f93-36af-41dc-98bd-2b497c664296"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:30:03 crc kubenswrapper[4711]: I1203 12:30:03.016063 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17b60f93-36af-41dc-98bd-2b497c664296-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "17b60f93-36af-41dc-98bd-2b497c664296" (UID: "17b60f93-36af-41dc-98bd-2b497c664296"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:30:03 crc kubenswrapper[4711]: I1203 12:30:03.016337 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17b60f93-36af-41dc-98bd-2b497c664296-kube-api-access-dp7pb" (OuterVolumeSpecName: "kube-api-access-dp7pb") pod "17b60f93-36af-41dc-98bd-2b497c664296" (UID: "17b60f93-36af-41dc-98bd-2b497c664296"). InnerVolumeSpecName "kube-api-access-dp7pb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:30:03 crc kubenswrapper[4711]: I1203 12:30:03.112724 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dp7pb\" (UniqueName: \"kubernetes.io/projected/17b60f93-36af-41dc-98bd-2b497c664296-kube-api-access-dp7pb\") on node \"crc\" DevicePath \"\"" Dec 03 12:30:03 crc kubenswrapper[4711]: I1203 12:30:03.112758 4711 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/17b60f93-36af-41dc-98bd-2b497c664296-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 12:30:03 crc kubenswrapper[4711]: I1203 12:30:03.112767 4711 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/17b60f93-36af-41dc-98bd-2b497c664296-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 12:30:03 crc kubenswrapper[4711]: I1203 12:30:03.713162 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-2nxz9" 
event={"ID":"17b60f93-36af-41dc-98bd-2b497c664296","Type":"ContainerDied","Data":"7f74ccd913939c70f673d458744b2d450b813e88968e85c0f57a1117f2746dc2"} Dec 03 12:30:03 crc kubenswrapper[4711]: I1203 12:30:03.713197 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-2nxz9" Dec 03 12:30:03 crc kubenswrapper[4711]: I1203 12:30:03.713208 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f74ccd913939c70f673d458744b2d450b813e88968e85c0f57a1117f2746dc2" Dec 03 12:30:05 crc kubenswrapper[4711]: I1203 12:30:05.401727 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:30:05 crc kubenswrapper[4711]: I1203 12:30:05.402127 4711 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:30:30 crc kubenswrapper[4711]: I1203 12:30:30.964986 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dtc2k"] Dec 03 12:30:30 crc kubenswrapper[4711]: I1203 12:30:30.965884 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-dtc2k" podUID="25c69a31-2a54-4077-9b8c-c859d1d20849" containerName="controller-manager" containerID="cri-o://bac514b78cfebca1d63fa1060ef3aa1e7578bb00cf6cafdca24d1bcf3cd95f89" gracePeriod=30 Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.065005 4711 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7trmd"] Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.065232 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7trmd" podUID="a0e8a48c-f60d-494e-8928-061c3235d3f1" containerName="route-controller-manager" containerID="cri-o://43ad24bccc0cfc99c100c4a3eb26ea0beefafd37164ea5e5f642992b834d3686" gracePeriod=30 Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.350393 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dtc2k" Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.395211 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7trmd" Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.498621 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25c69a31-2a54-4077-9b8c-c859d1d20849-serving-cert\") pod \"25c69a31-2a54-4077-9b8c-c859d1d20849\" (UID: \"25c69a31-2a54-4077-9b8c-c859d1d20849\") " Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.498716 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0e8a48c-f60d-494e-8928-061c3235d3f1-serving-cert\") pod \"a0e8a48c-f60d-494e-8928-061c3235d3f1\" (UID: \"a0e8a48c-f60d-494e-8928-061c3235d3f1\") " Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.498743 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9krlj\" (UniqueName: \"kubernetes.io/projected/25c69a31-2a54-4077-9b8c-c859d1d20849-kube-api-access-9krlj\") pod \"25c69a31-2a54-4077-9b8c-c859d1d20849\" (UID: 
\"25c69a31-2a54-4077-9b8c-c859d1d20849\") " Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.498772 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25c69a31-2a54-4077-9b8c-c859d1d20849-client-ca\") pod \"25c69a31-2a54-4077-9b8c-c859d1d20849\" (UID: \"25c69a31-2a54-4077-9b8c-c859d1d20849\") " Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.498795 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0e8a48c-f60d-494e-8928-061c3235d3f1-config\") pod \"a0e8a48c-f60d-494e-8928-061c3235d3f1\" (UID: \"a0e8a48c-f60d-494e-8928-061c3235d3f1\") " Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.498832 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25c69a31-2a54-4077-9b8c-c859d1d20849-proxy-ca-bundles\") pod \"25c69a31-2a54-4077-9b8c-c859d1d20849\" (UID: \"25c69a31-2a54-4077-9b8c-c859d1d20849\") " Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.498857 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25c69a31-2a54-4077-9b8c-c859d1d20849-config\") pod \"25c69a31-2a54-4077-9b8c-c859d1d20849\" (UID: \"25c69a31-2a54-4077-9b8c-c859d1d20849\") " Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.498881 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdlq8\" (UniqueName: \"kubernetes.io/projected/a0e8a48c-f60d-494e-8928-061c3235d3f1-kube-api-access-wdlq8\") pod \"a0e8a48c-f60d-494e-8928-061c3235d3f1\" (UID: \"a0e8a48c-f60d-494e-8928-061c3235d3f1\") " Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.498901 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/a0e8a48c-f60d-494e-8928-061c3235d3f1-client-ca\") pod \"a0e8a48c-f60d-494e-8928-061c3235d3f1\" (UID: \"a0e8a48c-f60d-494e-8928-061c3235d3f1\") " Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.499705 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25c69a31-2a54-4077-9b8c-c859d1d20849-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "25c69a31-2a54-4077-9b8c-c859d1d20849" (UID: "25c69a31-2a54-4077-9b8c-c859d1d20849"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.499713 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25c69a31-2a54-4077-9b8c-c859d1d20849-config" (OuterVolumeSpecName: "config") pod "25c69a31-2a54-4077-9b8c-c859d1d20849" (UID: "25c69a31-2a54-4077-9b8c-c859d1d20849"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.499728 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0e8a48c-f60d-494e-8928-061c3235d3f1-client-ca" (OuterVolumeSpecName: "client-ca") pod "a0e8a48c-f60d-494e-8928-061c3235d3f1" (UID: "a0e8a48c-f60d-494e-8928-061c3235d3f1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.499815 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0e8a48c-f60d-494e-8928-061c3235d3f1-config" (OuterVolumeSpecName: "config") pod "a0e8a48c-f60d-494e-8928-061c3235d3f1" (UID: "a0e8a48c-f60d-494e-8928-061c3235d3f1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.499950 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25c69a31-2a54-4077-9b8c-c859d1d20849-client-ca" (OuterVolumeSpecName: "client-ca") pod "25c69a31-2a54-4077-9b8c-c859d1d20849" (UID: "25c69a31-2a54-4077-9b8c-c859d1d20849"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.504286 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25c69a31-2a54-4077-9b8c-c859d1d20849-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "25c69a31-2a54-4077-9b8c-c859d1d20849" (UID: "25c69a31-2a54-4077-9b8c-c859d1d20849"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.504343 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0e8a48c-f60d-494e-8928-061c3235d3f1-kube-api-access-wdlq8" (OuterVolumeSpecName: "kube-api-access-wdlq8") pod "a0e8a48c-f60d-494e-8928-061c3235d3f1" (UID: "a0e8a48c-f60d-494e-8928-061c3235d3f1"). InnerVolumeSpecName "kube-api-access-wdlq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.504850 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25c69a31-2a54-4077-9b8c-c859d1d20849-kube-api-access-9krlj" (OuterVolumeSpecName: "kube-api-access-9krlj") pod "25c69a31-2a54-4077-9b8c-c859d1d20849" (UID: "25c69a31-2a54-4077-9b8c-c859d1d20849"). InnerVolumeSpecName "kube-api-access-9krlj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.506432 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0e8a48c-f60d-494e-8928-061c3235d3f1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a0e8a48c-f60d-494e-8928-061c3235d3f1" (UID: "a0e8a48c-f60d-494e-8928-061c3235d3f1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.600305 4711 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0e8a48c-f60d-494e-8928-061c3235d3f1-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.600363 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9krlj\" (UniqueName: \"kubernetes.io/projected/25c69a31-2a54-4077-9b8c-c859d1d20849-kube-api-access-9krlj\") on node \"crc\" DevicePath \"\"" Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.600383 4711 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25c69a31-2a54-4077-9b8c-c859d1d20849-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.600403 4711 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0e8a48c-f60d-494e-8928-061c3235d3f1-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.600420 4711 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25c69a31-2a54-4077-9b8c-c859d1d20849-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.600437 4711 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/25c69a31-2a54-4077-9b8c-c859d1d20849-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.600453 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdlq8\" (UniqueName: \"kubernetes.io/projected/a0e8a48c-f60d-494e-8928-061c3235d3f1-kube-api-access-wdlq8\") on node \"crc\" DevicePath \"\"" Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.600470 4711 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a0e8a48c-f60d-494e-8928-061c3235d3f1-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.600485 4711 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25c69a31-2a54-4077-9b8c-c859d1d20849-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.884567 4711 generic.go:334] "Generic (PLEG): container finished" podID="a0e8a48c-f60d-494e-8928-061c3235d3f1" containerID="43ad24bccc0cfc99c100c4a3eb26ea0beefafd37164ea5e5f642992b834d3686" exitCode=0 Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.884631 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7trmd" Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.884647 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7trmd" event={"ID":"a0e8a48c-f60d-494e-8928-061c3235d3f1","Type":"ContainerDied","Data":"43ad24bccc0cfc99c100c4a3eb26ea0beefafd37164ea5e5f642992b834d3686"} Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.884680 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7trmd" event={"ID":"a0e8a48c-f60d-494e-8928-061c3235d3f1","Type":"ContainerDied","Data":"2ffcfcf90ca529fbcec157e6a9f7884d4443d896b1a2284af811dc267cb01332"} Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.884701 4711 scope.go:117] "RemoveContainer" containerID="43ad24bccc0cfc99c100c4a3eb26ea0beefafd37164ea5e5f642992b834d3686" Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.887534 4711 generic.go:334] "Generic (PLEG): container finished" podID="25c69a31-2a54-4077-9b8c-c859d1d20849" containerID="bac514b78cfebca1d63fa1060ef3aa1e7578bb00cf6cafdca24d1bcf3cd95f89" exitCode=0 Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.887570 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dtc2k" event={"ID":"25c69a31-2a54-4077-9b8c-c859d1d20849","Type":"ContainerDied","Data":"bac514b78cfebca1d63fa1060ef3aa1e7578bb00cf6cafdca24d1bcf3cd95f89"} Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.887712 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dtc2k" Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.887593 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dtc2k" event={"ID":"25c69a31-2a54-4077-9b8c-c859d1d20849","Type":"ContainerDied","Data":"531619ce199a4bd5a6ed146c954693cda7dad75deb1fd460c4562fa41947d4d6"} Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.907879 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dtc2k"] Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.912456 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dtc2k"] Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.915327 4711 scope.go:117] "RemoveContainer" containerID="43ad24bccc0cfc99c100c4a3eb26ea0beefafd37164ea5e5f642992b834d3686" Dec 03 12:30:31 crc kubenswrapper[4711]: E1203 12:30:31.915868 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43ad24bccc0cfc99c100c4a3eb26ea0beefafd37164ea5e5f642992b834d3686\": container with ID starting with 43ad24bccc0cfc99c100c4a3eb26ea0beefafd37164ea5e5f642992b834d3686 not found: ID does not exist" containerID="43ad24bccc0cfc99c100c4a3eb26ea0beefafd37164ea5e5f642992b834d3686" Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.915900 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43ad24bccc0cfc99c100c4a3eb26ea0beefafd37164ea5e5f642992b834d3686"} err="failed to get container status \"43ad24bccc0cfc99c100c4a3eb26ea0beefafd37164ea5e5f642992b834d3686\": rpc error: code = NotFound desc = could not find container \"43ad24bccc0cfc99c100c4a3eb26ea0beefafd37164ea5e5f642992b834d3686\": container with ID starting with 
43ad24bccc0cfc99c100c4a3eb26ea0beefafd37164ea5e5f642992b834d3686 not found: ID does not exist" Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.915934 4711 scope.go:117] "RemoveContainer" containerID="bac514b78cfebca1d63fa1060ef3aa1e7578bb00cf6cafdca24d1bcf3cd95f89" Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.921587 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7trmd"] Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.925142 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7trmd"] Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.936184 4711 scope.go:117] "RemoveContainer" containerID="bac514b78cfebca1d63fa1060ef3aa1e7578bb00cf6cafdca24d1bcf3cd95f89" Dec 03 12:30:31 crc kubenswrapper[4711]: E1203 12:30:31.936858 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bac514b78cfebca1d63fa1060ef3aa1e7578bb00cf6cafdca24d1bcf3cd95f89\": container with ID starting with bac514b78cfebca1d63fa1060ef3aa1e7578bb00cf6cafdca24d1bcf3cd95f89 not found: ID does not exist" containerID="bac514b78cfebca1d63fa1060ef3aa1e7578bb00cf6cafdca24d1bcf3cd95f89" Dec 03 12:30:31 crc kubenswrapper[4711]: I1203 12:30:31.936934 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bac514b78cfebca1d63fa1060ef3aa1e7578bb00cf6cafdca24d1bcf3cd95f89"} err="failed to get container status \"bac514b78cfebca1d63fa1060ef3aa1e7578bb00cf6cafdca24d1bcf3cd95f89\": rpc error: code = NotFound desc = could not find container \"bac514b78cfebca1d63fa1060ef3aa1e7578bb00cf6cafdca24d1bcf3cd95f89\": container with ID starting with bac514b78cfebca1d63fa1060ef3aa1e7578bb00cf6cafdca24d1bcf3cd95f89 not found: ID does not exist" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.152589 4711 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5c4d958575-vt8z9"] Dec 03 12:30:32 crc kubenswrapper[4711]: E1203 12:30:32.154005 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25c69a31-2a54-4077-9b8c-c859d1d20849" containerName="controller-manager" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.154200 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="25c69a31-2a54-4077-9b8c-c859d1d20849" containerName="controller-manager" Dec 03 12:30:32 crc kubenswrapper[4711]: E1203 12:30:32.154332 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17b60f93-36af-41dc-98bd-2b497c664296" containerName="collect-profiles" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.154416 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="17b60f93-36af-41dc-98bd-2b497c664296" containerName="collect-profiles" Dec 03 12:30:32 crc kubenswrapper[4711]: E1203 12:30:32.154507 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0e8a48c-f60d-494e-8928-061c3235d3f1" containerName="route-controller-manager" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.154587 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0e8a48c-f60d-494e-8928-061c3235d3f1" containerName="route-controller-manager" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.154792 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="25c69a31-2a54-4077-9b8c-c859d1d20849" containerName="controller-manager" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.154894 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0e8a48c-f60d-494e-8928-061c3235d3f1" containerName="route-controller-manager" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.155019 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="17b60f93-36af-41dc-98bd-2b497c664296" containerName="collect-profiles" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 
12:30:32.155492 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c4d958575-vt8z9" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.158101 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c4d958575-vt8z9"] Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.158961 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.159136 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.159184 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.159757 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.161072 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.161163 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.166577 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.308292 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1e721633-5c7a-4912-992b-e5c6b6bf57af-client-ca\") pod \"controller-manager-5c4d958575-vt8z9\" (UID: 
\"1e721633-5c7a-4912-992b-e5c6b6bf57af\") " pod="openshift-controller-manager/controller-manager-5c4d958575-vt8z9" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.308536 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e721633-5c7a-4912-992b-e5c6b6bf57af-config\") pod \"controller-manager-5c4d958575-vt8z9\" (UID: \"1e721633-5c7a-4912-992b-e5c6b6bf57af\") " pod="openshift-controller-manager/controller-manager-5c4d958575-vt8z9" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.308772 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e721633-5c7a-4912-992b-e5c6b6bf57af-serving-cert\") pod \"controller-manager-5c4d958575-vt8z9\" (UID: \"1e721633-5c7a-4912-992b-e5c6b6bf57af\") " pod="openshift-controller-manager/controller-manager-5c4d958575-vt8z9" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.308840 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1e721633-5c7a-4912-992b-e5c6b6bf57af-proxy-ca-bundles\") pod \"controller-manager-5c4d958575-vt8z9\" (UID: \"1e721633-5c7a-4912-992b-e5c6b6bf57af\") " pod="openshift-controller-manager/controller-manager-5c4d958575-vt8z9" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.308919 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk2r2\" (UniqueName: \"kubernetes.io/projected/1e721633-5c7a-4912-992b-e5c6b6bf57af-kube-api-access-wk2r2\") pod \"controller-manager-5c4d958575-vt8z9\" (UID: \"1e721633-5c7a-4912-992b-e5c6b6bf57af\") " pod="openshift-controller-manager/controller-manager-5c4d958575-vt8z9" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.387589 4711 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-945b8bd79-jt2lm"] Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.388210 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-945b8bd79-jt2lm" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.390236 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.390584 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.390684 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.390722 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.390956 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.393754 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.399216 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-945b8bd79-jt2lm"] Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.410591 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e721633-5c7a-4912-992b-e5c6b6bf57af-serving-cert\") pod \"controller-manager-5c4d958575-vt8z9\" (UID: \"1e721633-5c7a-4912-992b-e5c6b6bf57af\") " 
pod="openshift-controller-manager/controller-manager-5c4d958575-vt8z9" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.411024 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1e721633-5c7a-4912-992b-e5c6b6bf57af-proxy-ca-bundles\") pod \"controller-manager-5c4d958575-vt8z9\" (UID: \"1e721633-5c7a-4912-992b-e5c6b6bf57af\") " pod="openshift-controller-manager/controller-manager-5c4d958575-vt8z9" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.411188 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk2r2\" (UniqueName: \"kubernetes.io/projected/1e721633-5c7a-4912-992b-e5c6b6bf57af-kube-api-access-wk2r2\") pod \"controller-manager-5c4d958575-vt8z9\" (UID: \"1e721633-5c7a-4912-992b-e5c6b6bf57af\") " pod="openshift-controller-manager/controller-manager-5c4d958575-vt8z9" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.411418 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1e721633-5c7a-4912-992b-e5c6b6bf57af-client-ca\") pod \"controller-manager-5c4d958575-vt8z9\" (UID: \"1e721633-5c7a-4912-992b-e5c6b6bf57af\") " pod="openshift-controller-manager/controller-manager-5c4d958575-vt8z9" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.411592 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e721633-5c7a-4912-992b-e5c6b6bf57af-config\") pod \"controller-manager-5c4d958575-vt8z9\" (UID: \"1e721633-5c7a-4912-992b-e5c6b6bf57af\") " pod="openshift-controller-manager/controller-manager-5c4d958575-vt8z9" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.414670 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e721633-5c7a-4912-992b-e5c6b6bf57af-config\") pod 
\"controller-manager-5c4d958575-vt8z9\" (UID: \"1e721633-5c7a-4912-992b-e5c6b6bf57af\") " pod="openshift-controller-manager/controller-manager-5c4d958575-vt8z9" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.419158 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1e721633-5c7a-4912-992b-e5c6b6bf57af-client-ca\") pod \"controller-manager-5c4d958575-vt8z9\" (UID: \"1e721633-5c7a-4912-992b-e5c6b6bf57af\") " pod="openshift-controller-manager/controller-manager-5c4d958575-vt8z9" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.420529 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1e721633-5c7a-4912-992b-e5c6b6bf57af-proxy-ca-bundles\") pod \"controller-manager-5c4d958575-vt8z9\" (UID: \"1e721633-5c7a-4912-992b-e5c6b6bf57af\") " pod="openshift-controller-manager/controller-manager-5c4d958575-vt8z9" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.421390 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e721633-5c7a-4912-992b-e5c6b6bf57af-serving-cert\") pod \"controller-manager-5c4d958575-vt8z9\" (UID: \"1e721633-5c7a-4912-992b-e5c6b6bf57af\") " pod="openshift-controller-manager/controller-manager-5c4d958575-vt8z9" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.434114 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk2r2\" (UniqueName: \"kubernetes.io/projected/1e721633-5c7a-4912-992b-e5c6b6bf57af-kube-api-access-wk2r2\") pod \"controller-manager-5c4d958575-vt8z9\" (UID: \"1e721633-5c7a-4912-992b-e5c6b6bf57af\") " pod="openshift-controller-manager/controller-manager-5c4d958575-vt8z9" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.470153 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5c4d958575-vt8z9" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.512779 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f526a2e-cd0d-4c48-b79f-f9d20a30cf4f-config\") pod \"route-controller-manager-945b8bd79-jt2lm\" (UID: \"0f526a2e-cd0d-4c48-b79f-f9d20a30cf4f\") " pod="openshift-route-controller-manager/route-controller-manager-945b8bd79-jt2lm" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.513109 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2pqd\" (UniqueName: \"kubernetes.io/projected/0f526a2e-cd0d-4c48-b79f-f9d20a30cf4f-kube-api-access-p2pqd\") pod \"route-controller-manager-945b8bd79-jt2lm\" (UID: \"0f526a2e-cd0d-4c48-b79f-f9d20a30cf4f\") " pod="openshift-route-controller-manager/route-controller-manager-945b8bd79-jt2lm" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.513245 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f526a2e-cd0d-4c48-b79f-f9d20a30cf4f-client-ca\") pod \"route-controller-manager-945b8bd79-jt2lm\" (UID: \"0f526a2e-cd0d-4c48-b79f-f9d20a30cf4f\") " pod="openshift-route-controller-manager/route-controller-manager-945b8bd79-jt2lm" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.513394 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f526a2e-cd0d-4c48-b79f-f9d20a30cf4f-serving-cert\") pod \"route-controller-manager-945b8bd79-jt2lm\" (UID: \"0f526a2e-cd0d-4c48-b79f-f9d20a30cf4f\") " pod="openshift-route-controller-manager/route-controller-manager-945b8bd79-jt2lm" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.614647 4711 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-p2pqd\" (UniqueName: \"kubernetes.io/projected/0f526a2e-cd0d-4c48-b79f-f9d20a30cf4f-kube-api-access-p2pqd\") pod \"route-controller-manager-945b8bd79-jt2lm\" (UID: \"0f526a2e-cd0d-4c48-b79f-f9d20a30cf4f\") " pod="openshift-route-controller-manager/route-controller-manager-945b8bd79-jt2lm" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.615066 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f526a2e-cd0d-4c48-b79f-f9d20a30cf4f-client-ca\") pod \"route-controller-manager-945b8bd79-jt2lm\" (UID: \"0f526a2e-cd0d-4c48-b79f-f9d20a30cf4f\") " pod="openshift-route-controller-manager/route-controller-manager-945b8bd79-jt2lm" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.615128 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f526a2e-cd0d-4c48-b79f-f9d20a30cf4f-serving-cert\") pod \"route-controller-manager-945b8bd79-jt2lm\" (UID: \"0f526a2e-cd0d-4c48-b79f-f9d20a30cf4f\") " pod="openshift-route-controller-manager/route-controller-manager-945b8bd79-jt2lm" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.615158 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f526a2e-cd0d-4c48-b79f-f9d20a30cf4f-config\") pod \"route-controller-manager-945b8bd79-jt2lm\" (UID: \"0f526a2e-cd0d-4c48-b79f-f9d20a30cf4f\") " pod="openshift-route-controller-manager/route-controller-manager-945b8bd79-jt2lm" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.616237 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f526a2e-cd0d-4c48-b79f-f9d20a30cf4f-config\") pod \"route-controller-manager-945b8bd79-jt2lm\" (UID: \"0f526a2e-cd0d-4c48-b79f-f9d20a30cf4f\") " 
pod="openshift-route-controller-manager/route-controller-manager-945b8bd79-jt2lm" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.617263 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f526a2e-cd0d-4c48-b79f-f9d20a30cf4f-client-ca\") pod \"route-controller-manager-945b8bd79-jt2lm\" (UID: \"0f526a2e-cd0d-4c48-b79f-f9d20a30cf4f\") " pod="openshift-route-controller-manager/route-controller-manager-945b8bd79-jt2lm" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.619528 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f526a2e-cd0d-4c48-b79f-f9d20a30cf4f-serving-cert\") pod \"route-controller-manager-945b8bd79-jt2lm\" (UID: \"0f526a2e-cd0d-4c48-b79f-f9d20a30cf4f\") " pod="openshift-route-controller-manager/route-controller-manager-945b8bd79-jt2lm" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.635695 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2pqd\" (UniqueName: \"kubernetes.io/projected/0f526a2e-cd0d-4c48-b79f-f9d20a30cf4f-kube-api-access-p2pqd\") pod \"route-controller-manager-945b8bd79-jt2lm\" (UID: \"0f526a2e-cd0d-4c48-b79f-f9d20a30cf4f\") " pod="openshift-route-controller-manager/route-controller-manager-945b8bd79-jt2lm" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.655147 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c4d958575-vt8z9"] Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.704374 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-945b8bd79-jt2lm" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.900513 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c4d958575-vt8z9" event={"ID":"1e721633-5c7a-4912-992b-e5c6b6bf57af","Type":"ContainerStarted","Data":"4235b39641a4435d17302976c4e9b8f00f043c637b488119288d69200adc6a68"} Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.900569 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c4d958575-vt8z9" event={"ID":"1e721633-5c7a-4912-992b-e5c6b6bf57af","Type":"ContainerStarted","Data":"c850458504756840795ebbf68ad552e51076d44ad10621cd0b3e02373e3512a6"} Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.901568 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5c4d958575-vt8z9" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.920808 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5c4d958575-vt8z9" Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.923243 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-945b8bd79-jt2lm"] Dec 03 12:30:32 crc kubenswrapper[4711]: I1203 12:30:32.925451 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5c4d958575-vt8z9" podStartSLOduration=0.925441187 podStartE2EDuration="925.441187ms" podCreationTimestamp="2025-12-03 12:30:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:30:32.915722946 +0000 UTC m=+951.584974211" watchObservedRunningTime="2025-12-03 12:30:32.925441187 +0000 UTC m=+951.594692442" Dec 
03 12:30:33 crc kubenswrapper[4711]: I1203 12:30:33.825019 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25c69a31-2a54-4077-9b8c-c859d1d20849" path="/var/lib/kubelet/pods/25c69a31-2a54-4077-9b8c-c859d1d20849/volumes" Dec 03 12:30:33 crc kubenswrapper[4711]: I1203 12:30:33.826232 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0e8a48c-f60d-494e-8928-061c3235d3f1" path="/var/lib/kubelet/pods/a0e8a48c-f60d-494e-8928-061c3235d3f1/volumes" Dec 03 12:30:33 crc kubenswrapper[4711]: I1203 12:30:33.916844 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-945b8bd79-jt2lm" event={"ID":"0f526a2e-cd0d-4c48-b79f-f9d20a30cf4f","Type":"ContainerStarted","Data":"428fb8df7550712921afcb6a8a7bdd4be0a0c69d8580d3faafae13b3e1dfc750"} Dec 03 12:30:33 crc kubenswrapper[4711]: I1203 12:30:33.916900 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-945b8bd79-jt2lm" event={"ID":"0f526a2e-cd0d-4c48-b79f-f9d20a30cf4f","Type":"ContainerStarted","Data":"0a5ce1ee3a4c8d3bfa628d46b7f857df0e329579fd9c3207258b251c546a7a52"} Dec 03 12:30:33 crc kubenswrapper[4711]: I1203 12:30:33.936213 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-945b8bd79-jt2lm" podStartSLOduration=2.936195597 podStartE2EDuration="2.936195597s" podCreationTimestamp="2025-12-03 12:30:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:30:33.934105939 +0000 UTC m=+952.603357214" watchObservedRunningTime="2025-12-03 12:30:33.936195597 +0000 UTC m=+952.605446852" Dec 03 12:30:34 crc kubenswrapper[4711]: I1203 12:30:34.922966 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-945b8bd79-jt2lm" Dec 03 12:30:34 crc kubenswrapper[4711]: I1203 12:30:34.927553 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-945b8bd79-jt2lm" Dec 03 12:30:35 crc kubenswrapper[4711]: I1203 12:30:35.401265 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:30:35 crc kubenswrapper[4711]: I1203 12:30:35.401561 4711 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:31:05 crc kubenswrapper[4711]: I1203 12:31:05.401578 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:31:05 crc kubenswrapper[4711]: I1203 12:31:05.402295 4711 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:31:05 crc kubenswrapper[4711]: I1203 12:31:05.402352 4711 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-52jgg" Dec 03 12:31:05 crc kubenswrapper[4711]: I1203 12:31:05.403009 4711 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ba490530ff3c59515e7bc84e7f01736c013e66e1dc8f01fbdfa3312d40c3ab5f"} pod="openshift-machine-config-operator/machine-config-daemon-52jgg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 12:31:05 crc kubenswrapper[4711]: I1203 12:31:05.403084 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" containerID="cri-o://ba490530ff3c59515e7bc84e7f01736c013e66e1dc8f01fbdfa3312d40c3ab5f" gracePeriod=600 Dec 03 12:31:07 crc kubenswrapper[4711]: I1203 12:31:07.108062 4711 generic.go:334] "Generic (PLEG): container finished" podID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerID="ba490530ff3c59515e7bc84e7f01736c013e66e1dc8f01fbdfa3312d40c3ab5f" exitCode=0 Dec 03 12:31:07 crc kubenswrapper[4711]: I1203 12:31:07.108126 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" event={"ID":"776e7d35-d59b-4d4a-97cd-aec4f2441c1e","Type":"ContainerDied","Data":"ba490530ff3c59515e7bc84e7f01736c013e66e1dc8f01fbdfa3312d40c3ab5f"} Dec 03 12:31:07 crc kubenswrapper[4711]: I1203 12:31:07.108415 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" event={"ID":"776e7d35-d59b-4d4a-97cd-aec4f2441c1e","Type":"ContainerStarted","Data":"9c8244cd04cab97f7bcedd65d783655e34619fe118586bb3e610a526227c7382"} Dec 03 12:31:07 crc kubenswrapper[4711]: I1203 12:31:07.108436 4711 scope.go:117] "RemoveContainer" 
containerID="353bb382f540c9af7fa95771c36b38b3219d1db4c595540997493e382b5d1428" Dec 03 12:33:35 crc kubenswrapper[4711]: I1203 12:33:35.401977 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:33:35 crc kubenswrapper[4711]: I1203 12:33:35.403459 4711 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:34:05 crc kubenswrapper[4711]: I1203 12:34:05.402348 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:34:05 crc kubenswrapper[4711]: I1203 12:34:05.403482 4711 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:34:35 crc kubenswrapper[4711]: I1203 12:34:35.401830 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:34:35 crc kubenswrapper[4711]: I1203 12:34:35.402981 4711 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:34:35 crc kubenswrapper[4711]: I1203 12:34:35.403081 4711 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" Dec 03 12:34:35 crc kubenswrapper[4711]: I1203 12:34:35.404250 4711 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9c8244cd04cab97f7bcedd65d783655e34619fe118586bb3e610a526227c7382"} pod="openshift-machine-config-operator/machine-config-daemon-52jgg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 12:34:35 crc kubenswrapper[4711]: I1203 12:34:35.404327 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" containerID="cri-o://9c8244cd04cab97f7bcedd65d783655e34619fe118586bb3e610a526227c7382" gracePeriod=600 Dec 03 12:34:36 crc kubenswrapper[4711]: I1203 12:34:36.321565 4711 generic.go:334] "Generic (PLEG): container finished" podID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerID="9c8244cd04cab97f7bcedd65d783655e34619fe118586bb3e610a526227c7382" exitCode=0 Dec 03 12:34:36 crc kubenswrapper[4711]: I1203 12:34:36.321653 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" event={"ID":"776e7d35-d59b-4d4a-97cd-aec4f2441c1e","Type":"ContainerDied","Data":"9c8244cd04cab97f7bcedd65d783655e34619fe118586bb3e610a526227c7382"} Dec 03 12:34:36 crc kubenswrapper[4711]: I1203 
12:34:36.322229 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" event={"ID":"776e7d35-d59b-4d4a-97cd-aec4f2441c1e","Type":"ContainerStarted","Data":"c2221b1d8dd35869dac84859caa67252140cee66906c26379e7532c6528f6458"} Dec 03 12:34:36 crc kubenswrapper[4711]: I1203 12:34:36.322266 4711 scope.go:117] "RemoveContainer" containerID="ba490530ff3c59515e7bc84e7f01736c013e66e1dc8f01fbdfa3312d40c3ab5f" Dec 03 12:34:42 crc kubenswrapper[4711]: I1203 12:34:42.208728 4711 scope.go:117] "RemoveContainer" containerID="c0a536fe61d5267a2e1f614a204accc0c74993e73d186240f154d39d4a93d2ac" Dec 03 12:35:42 crc kubenswrapper[4711]: I1203 12:35:42.247898 4711 scope.go:117] "RemoveContainer" containerID="9c3c598e7eca6587c566f7e0049b78407ca1ced734472560545e594cee9f64ab" Dec 03 12:35:42 crc kubenswrapper[4711]: I1203 12:35:42.273193 4711 scope.go:117] "RemoveContainer" containerID="6919cd3853a00e24126a78258530be384a92bb3d4803412d598d77093d2e55be" Dec 03 12:36:35 crc kubenswrapper[4711]: I1203 12:36:35.401432 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:36:35 crc kubenswrapper[4711]: I1203 12:36:35.403078 4711 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:37:05 crc kubenswrapper[4711]: I1203 12:37:05.401697 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:37:05 crc kubenswrapper[4711]: I1203 12:37:05.402430 4711 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:37:35 crc kubenswrapper[4711]: I1203 12:37:35.401302 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:37:35 crc kubenswrapper[4711]: I1203 12:37:35.401929 4711 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:37:35 crc kubenswrapper[4711]: I1203 12:37:35.401981 4711 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" Dec 03 12:37:35 crc kubenswrapper[4711]: I1203 12:37:35.402538 4711 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c2221b1d8dd35869dac84859caa67252140cee66906c26379e7532c6528f6458"} pod="openshift-machine-config-operator/machine-config-daemon-52jgg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 12:37:35 crc kubenswrapper[4711]: I1203 12:37:35.402602 4711 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" containerID="cri-o://c2221b1d8dd35869dac84859caa67252140cee66906c26379e7532c6528f6458" gracePeriod=600 Dec 03 12:37:36 crc kubenswrapper[4711]: I1203 12:37:36.424989 4711 generic.go:334] "Generic (PLEG): container finished" podID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerID="c2221b1d8dd35869dac84859caa67252140cee66906c26379e7532c6528f6458" exitCode=0 Dec 03 12:37:36 crc kubenswrapper[4711]: I1203 12:37:36.425068 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" event={"ID":"776e7d35-d59b-4d4a-97cd-aec4f2441c1e","Type":"ContainerDied","Data":"c2221b1d8dd35869dac84859caa67252140cee66906c26379e7532c6528f6458"} Dec 03 12:37:36 crc kubenswrapper[4711]: I1203 12:37:36.425304 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" event={"ID":"776e7d35-d59b-4d4a-97cd-aec4f2441c1e","Type":"ContainerStarted","Data":"8aceb8eade31346b968b79b9eafdd0ba0740fbe4943a8f7cf724864aa6dd593f"} Dec 03 12:37:36 crc kubenswrapper[4711]: I1203 12:37:36.425328 4711 scope.go:117] "RemoveContainer" containerID="9c8244cd04cab97f7bcedd65d783655e34619fe118586bb3e610a526227c7382" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.032793 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ct6xt"] Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.033823 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="ovn-controller" containerID="cri-o://53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e" gracePeriod=30 Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 
12:37:59.033875 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="kube-rbac-proxy-node" containerID="cri-o://f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa" gracePeriod=30 Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.033875 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="northd" containerID="cri-o://ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64" gracePeriod=30 Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.033985 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="ovn-acl-logging" containerID="cri-o://6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0" gracePeriod=30 Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.034067 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="nbdb" containerID="cri-o://8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a" gracePeriod=30 Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.034066 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52" gracePeriod=30 Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.034049 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" 
podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="sbdb" containerID="cri-o://923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b" gracePeriod=30 Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.081070 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="ovnkube-controller" containerID="cri-o://320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce" gracePeriod=30 Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.387935 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ct6xt_33d2332f-fdac-42be-891e-7eaef0e7ca9d/ovnkube-controller/3.log" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.390486 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ct6xt_33d2332f-fdac-42be-891e-7eaef0e7ca9d/ovn-acl-logging/0.log" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.391056 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ct6xt_33d2332f-fdac-42be-891e-7eaef0e7ca9d/ovn-controller/0.log" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.391436 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.446357 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4lz2d"] Dec 03 12:37:59 crc kubenswrapper[4711]: E1203 12:37:59.446668 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="ovn-controller" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.446683 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="ovn-controller" Dec 03 12:37:59 crc kubenswrapper[4711]: E1203 12:37:59.446696 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="ovnkube-controller" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.446704 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="ovnkube-controller" Dec 03 12:37:59 crc kubenswrapper[4711]: E1203 12:37:59.446740 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="ovnkube-controller" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.446750 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="ovnkube-controller" Dec 03 12:37:59 crc kubenswrapper[4711]: E1203 12:37:59.446758 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="nbdb" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.446765 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="nbdb" Dec 03 12:37:59 crc kubenswrapper[4711]: E1203 12:37:59.446773 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" 
containerName="ovnkube-controller" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.446780 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="ovnkube-controller" Dec 03 12:37:59 crc kubenswrapper[4711]: E1203 12:37:59.446790 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="kube-rbac-proxy-node" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.446819 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="kube-rbac-proxy-node" Dec 03 12:37:59 crc kubenswrapper[4711]: E1203 12:37:59.446829 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="sbdb" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.446836 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="sbdb" Dec 03 12:37:59 crc kubenswrapper[4711]: E1203 12:37:59.446847 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="kubecfg-setup" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.446854 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="kubecfg-setup" Dec 03 12:37:59 crc kubenswrapper[4711]: E1203 12:37:59.446863 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="ovn-acl-logging" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.446871 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="ovn-acl-logging" Dec 03 12:37:59 crc kubenswrapper[4711]: E1203 12:37:59.446915 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" 
containerName="kube-rbac-proxy-ovn-metrics" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.446923 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 12:37:59 crc kubenswrapper[4711]: E1203 12:37:59.446934 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="northd" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.446941 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="northd" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.447108 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="kube-rbac-proxy-node" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.447145 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.447154 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="ovnkube-controller" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.447163 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="ovnkube-controller" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.447171 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="ovnkube-controller" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.447180 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="ovn-acl-logging" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.447191 4711 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="ovnkube-controller" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.447222 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="sbdb" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.447231 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="ovn-controller" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.447241 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="northd" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.447248 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="nbdb" Dec 03 12:37:59 crc kubenswrapper[4711]: E1203 12:37:59.447393 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="ovnkube-controller" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.447403 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="ovnkube-controller" Dec 03 12:37:59 crc kubenswrapper[4711]: E1203 12:37:59.447412 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="ovnkube-controller" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.447420 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="ovnkube-controller" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.447575 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerName="ovnkube-controller" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.449231 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.484201 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-etc-openvswitch\") pod \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.484270 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-run-systemd\") pod \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.484300 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/33d2332f-fdac-42be-891e-7eaef0e7ca9d-ovnkube-script-lib\") pod \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.484337 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-slash\") pod \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.484357 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-run-openvswitch\") pod \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.484286 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "33d2332f-fdac-42be-891e-7eaef0e7ca9d" (UID: "33d2332f-fdac-42be-891e-7eaef0e7ca9d"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.484381 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33d2332f-fdac-42be-891e-7eaef0e7ca9d-ovn-node-metrics-cert\") pod \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.484396 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33d2332f-fdac-42be-891e-7eaef0e7ca9d-ovnkube-config\") pod \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.484423 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-run-ovn-kubernetes\") pod \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.484453 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-systemd-units\") pod \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.484447 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-run-openvswitch" (OuterVolumeSpecName: 
"run-openvswitch") pod "33d2332f-fdac-42be-891e-7eaef0e7ca9d" (UID: "33d2332f-fdac-42be-891e-7eaef0e7ca9d"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.484488 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-slash" (OuterVolumeSpecName: "host-slash") pod "33d2332f-fdac-42be-891e-7eaef0e7ca9d" (UID: "33d2332f-fdac-42be-891e-7eaef0e7ca9d"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.484493 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-node-log\") pod \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.484548 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-node-log" (OuterVolumeSpecName: "node-log") pod "33d2332f-fdac-42be-891e-7eaef0e7ca9d" (UID: "33d2332f-fdac-42be-891e-7eaef0e7ca9d"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.484552 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-run-ovn\") pod \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.484579 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-kubelet\") pod \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.484583 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "33d2332f-fdac-42be-891e-7eaef0e7ca9d" (UID: "33d2332f-fdac-42be-891e-7eaef0e7ca9d"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.484606 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "33d2332f-fdac-42be-891e-7eaef0e7ca9d" (UID: "33d2332f-fdac-42be-891e-7eaef0e7ca9d"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.484622 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-cni-bin\") pod \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.484646 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-run-netns\") pod \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.484629 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "33d2332f-fdac-42be-891e-7eaef0e7ca9d" (UID: "33d2332f-fdac-42be-891e-7eaef0e7ca9d"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.484665 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-var-lib-openvswitch\") pod \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.484711 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-log-socket\") pod \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.484732 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "33d2332f-fdac-42be-891e-7eaef0e7ca9d" (UID: "33d2332f-fdac-42be-891e-7eaef0e7ca9d"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.484737 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.484756 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-cni-netd\") pod \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.484759 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "33d2332f-fdac-42be-891e-7eaef0e7ca9d" (UID: "33d2332f-fdac-42be-891e-7eaef0e7ca9d"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.484783 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33d2332f-fdac-42be-891e-7eaef0e7ca9d-env-overrides\") pod \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.484787 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "33d2332f-fdac-42be-891e-7eaef0e7ca9d" (UID: "33d2332f-fdac-42be-891e-7eaef0e7ca9d"). 
InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.484813 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "33d2332f-fdac-42be-891e-7eaef0e7ca9d" (UID: "33d2332f-fdac-42be-891e-7eaef0e7ca9d"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.484814 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh5vs\" (UniqueName: \"kubernetes.io/projected/33d2332f-fdac-42be-891e-7eaef0e7ca9d-kube-api-access-vh5vs\") pod \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\" (UID: \"33d2332f-fdac-42be-891e-7eaef0e7ca9d\") " Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.485002 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33d2332f-fdac-42be-891e-7eaef0e7ca9d-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "33d2332f-fdac-42be-891e-7eaef0e7ca9d" (UID: "33d2332f-fdac-42be-891e-7eaef0e7ca9d"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.485037 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "33d2332f-fdac-42be-891e-7eaef0e7ca9d" (UID: "33d2332f-fdac-42be-891e-7eaef0e7ca9d"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.485053 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33d2332f-fdac-42be-891e-7eaef0e7ca9d-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "33d2332f-fdac-42be-891e-7eaef0e7ca9d" (UID: "33d2332f-fdac-42be-891e-7eaef0e7ca9d"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.485067 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-log-socket" (OuterVolumeSpecName: "log-socket") pod "33d2332f-fdac-42be-891e-7eaef0e7ca9d" (UID: "33d2332f-fdac-42be-891e-7eaef0e7ca9d"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.485095 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "33d2332f-fdac-42be-891e-7eaef0e7ca9d" (UID: "33d2332f-fdac-42be-891e-7eaef0e7ca9d"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.485235 4711 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/33d2332f-fdac-42be-891e-7eaef0e7ca9d-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.485251 4711 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-slash\") on node \"crc\" DevicePath \"\"" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.485260 4711 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.485268 4711 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33d2332f-fdac-42be-891e-7eaef0e7ca9d-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.485276 4711 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.485286 4711 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.485419 4711 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-node-log\") on node \"crc\" DevicePath \"\"" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 
12:37:59.485429 4711 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.485437 4711 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.485454 4711 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.485461 4711 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.485470 4711 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.485480 4711 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-log-socket\") on node \"crc\" DevicePath \"\"" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.485490 4711 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.485490 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/33d2332f-fdac-42be-891e-7eaef0e7ca9d-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "33d2332f-fdac-42be-891e-7eaef0e7ca9d" (UID: "33d2332f-fdac-42be-891e-7eaef0e7ca9d"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.485498 4711 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.485520 4711 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.492571 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33d2332f-fdac-42be-891e-7eaef0e7ca9d-kube-api-access-vh5vs" (OuterVolumeSpecName: "kube-api-access-vh5vs") pod "33d2332f-fdac-42be-891e-7eaef0e7ca9d" (UID: "33d2332f-fdac-42be-891e-7eaef0e7ca9d"). InnerVolumeSpecName "kube-api-access-vh5vs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.492982 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33d2332f-fdac-42be-891e-7eaef0e7ca9d-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "33d2332f-fdac-42be-891e-7eaef0e7ca9d" (UID: "33d2332f-fdac-42be-891e-7eaef0e7ca9d"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.504226 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "33d2332f-fdac-42be-891e-7eaef0e7ca9d" (UID: "33d2332f-fdac-42be-891e-7eaef0e7ca9d"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.552592 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ct6xt_33d2332f-fdac-42be-891e-7eaef0e7ca9d/ovnkube-controller/3.log" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.554540 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ct6xt_33d2332f-fdac-42be-891e-7eaef0e7ca9d/ovn-acl-logging/0.log" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.554982 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ct6xt_33d2332f-fdac-42be-891e-7eaef0e7ca9d/ovn-controller/0.log" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555295 4711 generic.go:334] "Generic (PLEG): container finished" podID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerID="320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce" exitCode=0 Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555316 4711 generic.go:334] "Generic (PLEG): container finished" podID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerID="923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b" exitCode=0 Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555327 4711 generic.go:334] "Generic (PLEG): container finished" podID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerID="8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a" exitCode=0 Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 
12:37:59.555338 4711 generic.go:334] "Generic (PLEG): container finished" podID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerID="ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64" exitCode=0 Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555348 4711 generic.go:334] "Generic (PLEG): container finished" podID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerID="32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52" exitCode=0 Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555359 4711 generic.go:334] "Generic (PLEG): container finished" podID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerID="f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa" exitCode=0 Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555367 4711 generic.go:334] "Generic (PLEG): container finished" podID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerID="6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0" exitCode=143 Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555375 4711 generic.go:334] "Generic (PLEG): container finished" podID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" containerID="53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e" exitCode=143 Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555381 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555406 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" event={"ID":"33d2332f-fdac-42be-891e-7eaef0e7ca9d","Type":"ContainerDied","Data":"320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555472 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" event={"ID":"33d2332f-fdac-42be-891e-7eaef0e7ca9d","Type":"ContainerDied","Data":"923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555490 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" event={"ID":"33d2332f-fdac-42be-891e-7eaef0e7ca9d","Type":"ContainerDied","Data":"8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555504 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" event={"ID":"33d2332f-fdac-42be-891e-7eaef0e7ca9d","Type":"ContainerDied","Data":"ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555515 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" event={"ID":"33d2332f-fdac-42be-891e-7eaef0e7ca9d","Type":"ContainerDied","Data":"32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555528 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" event={"ID":"33d2332f-fdac-42be-891e-7eaef0e7ca9d","Type":"ContainerDied","Data":"f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa"} Dec 03 12:37:59 crc 
kubenswrapper[4711]: I1203 12:37:59.555540 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555552 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555559 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555565 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555571 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555578 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555585 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555594 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e"} Dec 03 12:37:59 crc 
kubenswrapper[4711]: I1203 12:37:59.555600 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555609 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" event={"ID":"33d2332f-fdac-42be-891e-7eaef0e7ca9d","Type":"ContainerDied","Data":"6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555617 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555624 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555630 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555637 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555644 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555650 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555656 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555662 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555668 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555674 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555684 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" event={"ID":"33d2332f-fdac-42be-891e-7eaef0e7ca9d","Type":"ContainerDied","Data":"53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555694 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555702 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555585 4711 scope.go:117] 
"RemoveContainer" containerID="320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555708 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555802 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555808 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555813 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555817 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555822 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555827 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555832 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555841 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ct6xt" event={"ID":"33d2332f-fdac-42be-891e-7eaef0e7ca9d","Type":"ContainerDied","Data":"8e1603daf0b3d5bfc8122a5f6be62ae00b8257d9451d66d7ea4e101f253b4020"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555849 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555854 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555859 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555864 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555869 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555876 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555881 4711 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555886 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555891 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.555896 4711 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.558220 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gwhcr_216c3ac8-462c-49ec-87a2-c935d0c4ad25/kube-multus/2.log" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.559036 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gwhcr_216c3ac8-462c-49ec-87a2-c935d0c4ad25/kube-multus/1.log" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.559068 4711 generic.go:334] "Generic (PLEG): container finished" podID="216c3ac8-462c-49ec-87a2-c935d0c4ad25" containerID="1535e90b589a87c3e19fcbcae492e601aaffa008889481cd7a1ffad7b23a76ae" exitCode=2 Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.559092 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gwhcr" event={"ID":"216c3ac8-462c-49ec-87a2-c935d0c4ad25","Type":"ContainerDied","Data":"1535e90b589a87c3e19fcbcae492e601aaffa008889481cd7a1ffad7b23a76ae"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.559110 4711 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ad2a184bbabe1f39e87385729c4f5006623e99a5008fff374ff8754bba2f093"} Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.559486 4711 scope.go:117] "RemoveContainer" containerID="1535e90b589a87c3e19fcbcae492e601aaffa008889481cd7a1ffad7b23a76ae" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.582159 4711 scope.go:117] "RemoveContainer" containerID="d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.586183 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-run-openvswitch\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.586235 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/17350065-93fe-4f5b-865f-6e2de45bb41b-ovnkube-config\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.586266 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-host-cni-bin\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.586295 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/17350065-93fe-4f5b-865f-6e2de45bb41b-ovnkube-script-lib\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.586315 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-host-kubelet\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.586333 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-log-socket\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.586350 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-run-systemd\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.586368 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/17350065-93fe-4f5b-865f-6e2de45bb41b-ovn-node-metrics-cert\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.586394 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" 
(UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-systemd-units\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.586413 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-host-run-netns\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.586442 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5fsk\" (UniqueName: \"kubernetes.io/projected/17350065-93fe-4f5b-865f-6e2de45bb41b-kube-api-access-z5fsk\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.586457 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-node-log\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.586473 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-run-ovn\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.586524 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-host-slash\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.586548 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-etc-openvswitch\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.586564 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-host-run-ovn-kubernetes\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.586581 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-var-lib-openvswitch\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.586596 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-host-cni-netd\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.586623 4711 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/17350065-93fe-4f5b-865f-6e2de45bb41b-env-overrides\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.586645 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.586685 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh5vs\" (UniqueName: \"kubernetes.io/projected/33d2332f-fdac-42be-891e-7eaef0e7ca9d-kube-api-access-vh5vs\") on node \"crc\" DevicePath \"\"" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.586696 4711 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/33d2332f-fdac-42be-891e-7eaef0e7ca9d-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.586707 4711 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33d2332f-fdac-42be-891e-7eaef0e7ca9d-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.586718 4711 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33d2332f-fdac-42be-891e-7eaef0e7ca9d-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.605349 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-ovn-kubernetes/ovnkube-node-ct6xt"] Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.609394 4711 scope.go:117] "RemoveContainer" containerID="923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.610596 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ct6xt"] Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.622102 4711 scope.go:117] "RemoveContainer" containerID="8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.634238 4711 scope.go:117] "RemoveContainer" containerID="ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.645713 4711 scope.go:117] "RemoveContainer" containerID="32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.658608 4711 scope.go:117] "RemoveContainer" containerID="f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.671220 4711 scope.go:117] "RemoveContainer" containerID="6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.685901 4711 scope.go:117] "RemoveContainer" containerID="53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.687489 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/17350065-93fe-4f5b-865f-6e2de45bb41b-env-overrides\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.687551 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.687574 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-run-openvswitch\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.687618 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/17350065-93fe-4f5b-865f-6e2de45bb41b-ovnkube-config\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.687634 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-host-cni-bin\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.687668 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.687664 4711 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/17350065-93fe-4f5b-865f-6e2de45bb41b-ovnkube-script-lib\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.687704 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-host-kubelet\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.687715 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-host-cni-bin\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.687724 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-log-socket\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.687744 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-run-openvswitch\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.687742 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-run-systemd\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.687786 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/17350065-93fe-4f5b-865f-6e2de45bb41b-ovn-node-metrics-cert\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.687804 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-systemd-units\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.687820 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-host-run-netns\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.687868 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5fsk\" (UniqueName: \"kubernetes.io/projected/17350065-93fe-4f5b-865f-6e2de45bb41b-kube-api-access-z5fsk\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.688189 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-host-run-netns\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.688254 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-systemd-units\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.688283 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-host-kubelet\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.688306 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-run-systemd\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.688348 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-log-socket\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.688406 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-node-log\") pod \"ovnkube-node-4lz2d\" (UID: 
\"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.688437 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-node-log\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.688468 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-run-ovn\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.688492 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-host-slash\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.688518 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-etc-openvswitch\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.688544 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-etc-openvswitch\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc 
kubenswrapper[4711]: I1203 12:37:59.688547 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-host-slash\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.688584 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-run-ovn\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.688595 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-host-run-ovn-kubernetes\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.688623 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-host-run-ovn-kubernetes\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.688638 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-var-lib-openvswitch\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.688665 4711 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-host-cni-netd\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.688704 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-var-lib-openvswitch\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.688828 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/17350065-93fe-4f5b-865f-6e2de45bb41b-host-cni-netd\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.689103 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/17350065-93fe-4f5b-865f-6e2de45bb41b-ovnkube-config\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.689152 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/17350065-93fe-4f5b-865f-6e2de45bb41b-ovnkube-script-lib\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.690118 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/17350065-93fe-4f5b-865f-6e2de45bb41b-env-overrides\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.694192 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/17350065-93fe-4f5b-865f-6e2de45bb41b-ovn-node-metrics-cert\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.703424 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5fsk\" (UniqueName: \"kubernetes.io/projected/17350065-93fe-4f5b-865f-6e2de45bb41b-kube-api-access-z5fsk\") pod \"ovnkube-node-4lz2d\" (UID: \"17350065-93fe-4f5b-865f-6e2de45bb41b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.711505 4711 scope.go:117] "RemoveContainer" containerID="b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.725259 4711 scope.go:117] "RemoveContainer" containerID="320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce" Dec 03 12:37:59 crc kubenswrapper[4711]: E1203 12:37:59.726039 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce\": container with ID starting with 320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce not found: ID does not exist" containerID="320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.726085 4711 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce"} err="failed to get container status \"320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce\": rpc error: code = NotFound desc = could not find container \"320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce\": container with ID starting with 320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.726117 4711 scope.go:117] "RemoveContainer" containerID="d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654" Dec 03 12:37:59 crc kubenswrapper[4711]: E1203 12:37:59.726533 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654\": container with ID starting with d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654 not found: ID does not exist" containerID="d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.726564 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654"} err="failed to get container status \"d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654\": rpc error: code = NotFound desc = could not find container \"d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654\": container with ID starting with d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654 not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.726583 4711 scope.go:117] "RemoveContainer" containerID="923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b" Dec 03 12:37:59 crc kubenswrapper[4711]: E1203 12:37:59.726842 4711 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\": container with ID starting with 923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b not found: ID does not exist" containerID="923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.726868 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b"} err="failed to get container status \"923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\": rpc error: code = NotFound desc = could not find container \"923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\": container with ID starting with 923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.726888 4711 scope.go:117] "RemoveContainer" containerID="8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a" Dec 03 12:37:59 crc kubenswrapper[4711]: E1203 12:37:59.727211 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\": container with ID starting with 8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a not found: ID does not exist" containerID="8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.727235 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a"} err="failed to get container status \"8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\": rpc error: code = NotFound desc = could not find container 
\"8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\": container with ID starting with 8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.727249 4711 scope.go:117] "RemoveContainer" containerID="ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64" Dec 03 12:37:59 crc kubenswrapper[4711]: E1203 12:37:59.727512 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\": container with ID starting with ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64 not found: ID does not exist" containerID="ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.727535 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64"} err="failed to get container status \"ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\": rpc error: code = NotFound desc = could not find container \"ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\": container with ID starting with ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64 not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.727551 4711 scope.go:117] "RemoveContainer" containerID="32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52" Dec 03 12:37:59 crc kubenswrapper[4711]: E1203 12:37:59.727788 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\": container with ID starting with 32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52 not found: ID does not exist" 
containerID="32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.727813 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52"} err="failed to get container status \"32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\": rpc error: code = NotFound desc = could not find container \"32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\": container with ID starting with 32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52 not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.727828 4711 scope.go:117] "RemoveContainer" containerID="f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa" Dec 03 12:37:59 crc kubenswrapper[4711]: E1203 12:37:59.728108 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\": container with ID starting with f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa not found: ID does not exist" containerID="f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.728135 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa"} err="failed to get container status \"f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\": rpc error: code = NotFound desc = could not find container \"f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\": container with ID starting with f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.728151 4711 scope.go:117] 
"RemoveContainer" containerID="6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0" Dec 03 12:37:59 crc kubenswrapper[4711]: E1203 12:37:59.728457 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\": container with ID starting with 6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0 not found: ID does not exist" containerID="6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.728483 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0"} err="failed to get container status \"6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\": rpc error: code = NotFound desc = could not find container \"6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\": container with ID starting with 6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0 not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.728498 4711 scope.go:117] "RemoveContainer" containerID="53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e" Dec 03 12:37:59 crc kubenswrapper[4711]: E1203 12:37:59.728848 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\": container with ID starting with 53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e not found: ID does not exist" containerID="53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.728871 4711 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e"} err="failed to get container status \"53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\": rpc error: code = NotFound desc = could not find container \"53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\": container with ID starting with 53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.728882 4711 scope.go:117] "RemoveContainer" containerID="b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2" Dec 03 12:37:59 crc kubenswrapper[4711]: E1203 12:37:59.729115 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\": container with ID starting with b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2 not found: ID does not exist" containerID="b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.729138 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2"} err="failed to get container status \"b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\": rpc error: code = NotFound desc = could not find container \"b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\": container with ID starting with b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2 not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.729156 4711 scope.go:117] "RemoveContainer" containerID="320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.729383 4711 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce"} err="failed to get container status \"320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce\": rpc error: code = NotFound desc = could not find container \"320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce\": container with ID starting with 320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.729405 4711 scope.go:117] "RemoveContainer" containerID="d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.729632 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654"} err="failed to get container status \"d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654\": rpc error: code = NotFound desc = could not find container \"d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654\": container with ID starting with d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654 not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.729652 4711 scope.go:117] "RemoveContainer" containerID="923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.729838 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b"} err="failed to get container status \"923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\": rpc error: code = NotFound desc = could not find container \"923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\": container with ID starting with 923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b not 
found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.729856 4711 scope.go:117] "RemoveContainer" containerID="8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.730080 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a"} err="failed to get container status \"8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\": rpc error: code = NotFound desc = could not find container \"8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\": container with ID starting with 8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.730102 4711 scope.go:117] "RemoveContainer" containerID="ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.730389 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64"} err="failed to get container status \"ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\": rpc error: code = NotFound desc = could not find container \"ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\": container with ID starting with ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64 not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.730414 4711 scope.go:117] "RemoveContainer" containerID="32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.730615 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52"} err="failed to get 
container status \"32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\": rpc error: code = NotFound desc = could not find container \"32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\": container with ID starting with 32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52 not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.730634 4711 scope.go:117] "RemoveContainer" containerID="f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.730823 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa"} err="failed to get container status \"f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\": rpc error: code = NotFound desc = could not find container \"f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\": container with ID starting with f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.730840 4711 scope.go:117] "RemoveContainer" containerID="6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.731017 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0"} err="failed to get container status \"6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\": rpc error: code = NotFound desc = could not find container \"6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\": container with ID starting with 6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0 not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.731033 4711 scope.go:117] "RemoveContainer" 
containerID="53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.731193 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e"} err="failed to get container status \"53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\": rpc error: code = NotFound desc = could not find container \"53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\": container with ID starting with 53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.731210 4711 scope.go:117] "RemoveContainer" containerID="b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.731434 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2"} err="failed to get container status \"b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\": rpc error: code = NotFound desc = could not find container \"b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\": container with ID starting with b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2 not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.731450 4711 scope.go:117] "RemoveContainer" containerID="320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.731708 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce"} err="failed to get container status \"320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce\": rpc error: code = NotFound desc = could 
not find container \"320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce\": container with ID starting with 320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.731730 4711 scope.go:117] "RemoveContainer" containerID="d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.731932 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654"} err="failed to get container status \"d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654\": rpc error: code = NotFound desc = could not find container \"d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654\": container with ID starting with d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654 not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.731951 4711 scope.go:117] "RemoveContainer" containerID="923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.732214 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b"} err="failed to get container status \"923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\": rpc error: code = NotFound desc = could not find container \"923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\": container with ID starting with 923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.732236 4711 scope.go:117] "RemoveContainer" containerID="8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 
12:37:59.732443 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a"} err="failed to get container status \"8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\": rpc error: code = NotFound desc = could not find container \"8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\": container with ID starting with 8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.732462 4711 scope.go:117] "RemoveContainer" containerID="ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.732718 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64"} err="failed to get container status \"ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\": rpc error: code = NotFound desc = could not find container \"ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\": container with ID starting with ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64 not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.732738 4711 scope.go:117] "RemoveContainer" containerID="32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.733094 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52"} err="failed to get container status \"32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\": rpc error: code = NotFound desc = could not find container \"32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\": container with ID starting with 
32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52 not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.733121 4711 scope.go:117] "RemoveContainer" containerID="f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.733365 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa"} err="failed to get container status \"f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\": rpc error: code = NotFound desc = could not find container \"f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\": container with ID starting with f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.733391 4711 scope.go:117] "RemoveContainer" containerID="6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.734222 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0"} err="failed to get container status \"6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\": rpc error: code = NotFound desc = could not find container \"6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\": container with ID starting with 6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0 not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.734243 4711 scope.go:117] "RemoveContainer" containerID="53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.734497 4711 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e"} err="failed to get container status \"53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\": rpc error: code = NotFound desc = could not find container \"53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\": container with ID starting with 53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.734520 4711 scope.go:117] "RemoveContainer" containerID="b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.734750 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2"} err="failed to get container status \"b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\": rpc error: code = NotFound desc = could not find container \"b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\": container with ID starting with b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2 not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.734770 4711 scope.go:117] "RemoveContainer" containerID="320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.734986 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce"} err="failed to get container status \"320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce\": rpc error: code = NotFound desc = could not find container \"320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce\": container with ID starting with 320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce not found: ID does not 
exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.735005 4711 scope.go:117] "RemoveContainer" containerID="d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.735246 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654"} err="failed to get container status \"d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654\": rpc error: code = NotFound desc = could not find container \"d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654\": container with ID starting with d53b6336111c7d270701cb63eb421dbbd82b96be4e25660e9261e613ebd7b654 not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.735263 4711 scope.go:117] "RemoveContainer" containerID="923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.735506 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b"} err="failed to get container status \"923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\": rpc error: code = NotFound desc = could not find container \"923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b\": container with ID starting with 923b3e9a222a2cfdd935648d5d1fb76cba49d3626901a0b39c37ef7a2d8b9a0b not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.735529 4711 scope.go:117] "RemoveContainer" containerID="8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.735696 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a"} err="failed to get container status 
\"8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\": rpc error: code = NotFound desc = could not find container \"8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a\": container with ID starting with 8d045bec9bf0289bf25453b436750fa6decf21a48ad6cb0515443f5d3f9f893a not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.735712 4711 scope.go:117] "RemoveContainer" containerID="ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.735900 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64"} err="failed to get container status \"ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\": rpc error: code = NotFound desc = could not find container \"ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64\": container with ID starting with ed518e301e8c3eccd6d4c1d6b5863b354f4405747244665f990c1600bdda7c64 not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.735931 4711 scope.go:117] "RemoveContainer" containerID="32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.736109 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52"} err="failed to get container status \"32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\": rpc error: code = NotFound desc = could not find container \"32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52\": container with ID starting with 32d073f8e2ff37253a30d00af53d466f6e167e5e7bb89f3900fef7fac334ef52 not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.736125 4711 scope.go:117] "RemoveContainer" 
containerID="f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.736348 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa"} err="failed to get container status \"f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\": rpc error: code = NotFound desc = could not find container \"f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa\": container with ID starting with f037a90f39e13081e95d480a55f6087a87b7a7b4b55754fcf1d23f165c8e70aa not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.736365 4711 scope.go:117] "RemoveContainer" containerID="6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.736585 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0"} err="failed to get container status \"6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\": rpc error: code = NotFound desc = could not find container \"6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0\": container with ID starting with 6d553a25aa04f0c0d12f8fbcdec01d035cf58498b6770a141175a49f8fa1c3e0 not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.736605 4711 scope.go:117] "RemoveContainer" containerID="53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.736775 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e"} err="failed to get container status \"53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\": rpc error: code = NotFound desc = could 
not find container \"53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e\": container with ID starting with 53a4f0c2b59d9cdcdf7043f2f23041a22b4865f8b55048564738b721e07c3e6e not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.736792 4711 scope.go:117] "RemoveContainer" containerID="b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.737019 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2"} err="failed to get container status \"b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\": rpc error: code = NotFound desc = could not find container \"b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2\": container with ID starting with b62a504dfb2d740fb68d5695c25ff46857b97b339019f0bf16b303c21d01cda2 not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.737039 4711 scope.go:117] "RemoveContainer" containerID="320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.737366 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce"} err="failed to get container status \"320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce\": rpc error: code = NotFound desc = could not find container \"320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce\": container with ID starting with 320244777d4720bb13366e087a7baedee11af87b5ff6aa9d24e7621d19bf27ce not found: ID does not exist" Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.762568 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:37:59 crc kubenswrapper[4711]: W1203 12:37:59.784475 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17350065_93fe_4f5b_865f_6e2de45bb41b.slice/crio-481a0a89b1c910da9d2c58a1f3f1378adc57a1d8857fc7c043a83e00a45712fe WatchSource:0}: Error finding container 481a0a89b1c910da9d2c58a1f3f1378adc57a1d8857fc7c043a83e00a45712fe: Status 404 returned error can't find the container with id 481a0a89b1c910da9d2c58a1f3f1378adc57a1d8857fc7c043a83e00a45712fe Dec 03 12:37:59 crc kubenswrapper[4711]: I1203 12:37:59.825180 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33d2332f-fdac-42be-891e-7eaef0e7ca9d" path="/var/lib/kubelet/pods/33d2332f-fdac-42be-891e-7eaef0e7ca9d/volumes" Dec 03 12:38:00 crc kubenswrapper[4711]: I1203 12:38:00.569395 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gwhcr_216c3ac8-462c-49ec-87a2-c935d0c4ad25/kube-multus/2.log" Dec 03 12:38:00 crc kubenswrapper[4711]: I1203 12:38:00.570717 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gwhcr_216c3ac8-462c-49ec-87a2-c935d0c4ad25/kube-multus/1.log" Dec 03 12:38:00 crc kubenswrapper[4711]: I1203 12:38:00.570782 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gwhcr" event={"ID":"216c3ac8-462c-49ec-87a2-c935d0c4ad25","Type":"ContainerStarted","Data":"dfaf43bc90f11674063646a3357308374cbadcc3627b117110232618f3da36dc"} Dec 03 12:38:00 crc kubenswrapper[4711]: I1203 12:38:00.585505 4711 generic.go:334] "Generic (PLEG): container finished" podID="17350065-93fe-4f5b-865f-6e2de45bb41b" containerID="306e09bfc12e469b674800b7bb864f085ee22a8e3912923824ed38d1cf1c608a" exitCode=0 Dec 03 12:38:00 crc kubenswrapper[4711]: I1203 12:38:00.585627 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" event={"ID":"17350065-93fe-4f5b-865f-6e2de45bb41b","Type":"ContainerDied","Data":"306e09bfc12e469b674800b7bb864f085ee22a8e3912923824ed38d1cf1c608a"} Dec 03 12:38:00 crc kubenswrapper[4711]: I1203 12:38:00.585656 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" event={"ID":"17350065-93fe-4f5b-865f-6e2de45bb41b","Type":"ContainerStarted","Data":"481a0a89b1c910da9d2c58a1f3f1378adc57a1d8857fc7c043a83e00a45712fe"} Dec 03 12:38:01 crc kubenswrapper[4711]: I1203 12:38:01.596897 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" event={"ID":"17350065-93fe-4f5b-865f-6e2de45bb41b","Type":"ContainerStarted","Data":"2578b752a7c585081c48c6d2032e624aa4ebd15416a2b88a573cb1fb012badba"} Dec 03 12:38:01 crc kubenswrapper[4711]: I1203 12:38:01.597258 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" event={"ID":"17350065-93fe-4f5b-865f-6e2de45bb41b","Type":"ContainerStarted","Data":"906a3bfa7e6d9e65d95e83f5876afd84974a7b6d2e98fe3936627d2ffef3ead1"} Dec 03 12:38:01 crc kubenswrapper[4711]: I1203 12:38:01.597272 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" event={"ID":"17350065-93fe-4f5b-865f-6e2de45bb41b","Type":"ContainerStarted","Data":"3b7b6b1779d588bd5b00857aec268a96cdfc24b497375bd942f7e62c9df6d00e"} Dec 03 12:38:01 crc kubenswrapper[4711]: I1203 12:38:01.597283 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" event={"ID":"17350065-93fe-4f5b-865f-6e2de45bb41b","Type":"ContainerStarted","Data":"b189807be22ed3283f216b9455d913e841caaf5611af3e7fa17074c2675cfbfa"} Dec 03 12:38:02 crc kubenswrapper[4711]: I1203 12:38:02.608194 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" 
event={"ID":"17350065-93fe-4f5b-865f-6e2de45bb41b","Type":"ContainerStarted","Data":"b653f72ca0795c2229b17cd0c94d5e14993d02d6dc8f42fc3ed287abb2cbf0eb"} Dec 03 12:38:02 crc kubenswrapper[4711]: I1203 12:38:02.608605 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" event={"ID":"17350065-93fe-4f5b-865f-6e2de45bb41b","Type":"ContainerStarted","Data":"8b49b1e4de6eeb2c6278be4d90dc163872aba494d50facabebc899ac6fe8f2ac"} Dec 03 12:38:04 crc kubenswrapper[4711]: I1203 12:38:04.627610 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" event={"ID":"17350065-93fe-4f5b-865f-6e2de45bb41b","Type":"ContainerStarted","Data":"68aea0b9b74e1cef96abb134e38eeed1ebd903dc13cdf7b34788dbe433579184"} Dec 03 12:38:07 crc kubenswrapper[4711]: I1203 12:38:07.645944 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" event={"ID":"17350065-93fe-4f5b-865f-6e2de45bb41b","Type":"ContainerStarted","Data":"6039c8f88e5f96827b62c8e3e06a65eef577d550e6d6fb675fc232d051af7145"} Dec 03 12:38:07 crc kubenswrapper[4711]: I1203 12:38:07.646259 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:38:07 crc kubenswrapper[4711]: I1203 12:38:07.646272 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:38:07 crc kubenswrapper[4711]: I1203 12:38:07.672902 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:38:07 crc kubenswrapper[4711]: I1203 12:38:07.675065 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" podStartSLOduration=8.675050549 podStartE2EDuration="8.675050549s" podCreationTimestamp="2025-12-03 12:37:59 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:38:07.672586171 +0000 UTC m=+1406.341837436" watchObservedRunningTime="2025-12-03 12:38:07.675050549 +0000 UTC m=+1406.344301814" Dec 03 12:38:08 crc kubenswrapper[4711]: I1203 12:38:08.653677 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:38:08 crc kubenswrapper[4711]: I1203 12:38:08.679989 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:38:29 crc kubenswrapper[4711]: I1203 12:38:29.802543 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4lz2d" Dec 03 12:38:32 crc kubenswrapper[4711]: I1203 12:38:32.373469 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms"] Dec 03 12:38:32 crc kubenswrapper[4711]: I1203 12:38:32.377797 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms" Dec 03 12:38:32 crc kubenswrapper[4711]: I1203 12:38:32.379803 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 12:38:32 crc kubenswrapper[4711]: I1203 12:38:32.381577 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms"] Dec 03 12:38:32 crc kubenswrapper[4711]: I1203 12:38:32.419339 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/19e4b241-651d-4300-9adb-3ef74168d5f7-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms\" (UID: \"19e4b241-651d-4300-9adb-3ef74168d5f7\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms" Dec 03 12:38:32 crc kubenswrapper[4711]: I1203 12:38:32.419452 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19e4b241-651d-4300-9adb-3ef74168d5f7-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms\" (UID: \"19e4b241-651d-4300-9adb-3ef74168d5f7\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms" Dec 03 12:38:32 crc kubenswrapper[4711]: I1203 12:38:32.419498 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h87pf\" (UniqueName: \"kubernetes.io/projected/19e4b241-651d-4300-9adb-3ef74168d5f7-kube-api-access-h87pf\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms\" (UID: \"19e4b241-651d-4300-9adb-3ef74168d5f7\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms" Dec 03 12:38:32 crc kubenswrapper[4711]: 
I1203 12:38:32.520852 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19e4b241-651d-4300-9adb-3ef74168d5f7-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms\" (UID: \"19e4b241-651d-4300-9adb-3ef74168d5f7\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms" Dec 03 12:38:32 crc kubenswrapper[4711]: I1203 12:38:32.520903 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h87pf\" (UniqueName: \"kubernetes.io/projected/19e4b241-651d-4300-9adb-3ef74168d5f7-kube-api-access-h87pf\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms\" (UID: \"19e4b241-651d-4300-9adb-3ef74168d5f7\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms" Dec 03 12:38:32 crc kubenswrapper[4711]: I1203 12:38:32.520986 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/19e4b241-651d-4300-9adb-3ef74168d5f7-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms\" (UID: \"19e4b241-651d-4300-9adb-3ef74168d5f7\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms" Dec 03 12:38:32 crc kubenswrapper[4711]: I1203 12:38:32.521656 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19e4b241-651d-4300-9adb-3ef74168d5f7-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms\" (UID: \"19e4b241-651d-4300-9adb-3ef74168d5f7\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms" Dec 03 12:38:32 crc kubenswrapper[4711]: I1203 12:38:32.521675 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/19e4b241-651d-4300-9adb-3ef74168d5f7-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms\" (UID: \"19e4b241-651d-4300-9adb-3ef74168d5f7\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms" Dec 03 12:38:32 crc kubenswrapper[4711]: I1203 12:38:32.545273 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h87pf\" (UniqueName: \"kubernetes.io/projected/19e4b241-651d-4300-9adb-3ef74168d5f7-kube-api-access-h87pf\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms\" (UID: \"19e4b241-651d-4300-9adb-3ef74168d5f7\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms" Dec 03 12:38:32 crc kubenswrapper[4711]: I1203 12:38:32.704859 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms" Dec 03 12:38:32 crc kubenswrapper[4711]: I1203 12:38:32.901261 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms"] Dec 03 12:38:33 crc kubenswrapper[4711]: I1203 12:38:33.815482 4711 generic.go:334] "Generic (PLEG): container finished" podID="19e4b241-651d-4300-9adb-3ef74168d5f7" containerID="88135ad65a7e381ea3b8c4e32808a65c8481aa62e9e0e898e8b78096a2d1eff4" exitCode=0 Dec 03 12:38:33 crc kubenswrapper[4711]: I1203 12:38:33.815623 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms" event={"ID":"19e4b241-651d-4300-9adb-3ef74168d5f7","Type":"ContainerDied","Data":"88135ad65a7e381ea3b8c4e32808a65c8481aa62e9e0e898e8b78096a2d1eff4"} Dec 03 12:38:33 crc kubenswrapper[4711]: I1203 12:38:33.815894 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms" event={"ID":"19e4b241-651d-4300-9adb-3ef74168d5f7","Type":"ContainerStarted","Data":"83c0a956a82b29ed261a3846978a1819e9dfba63e93a947d4fe2fd096da16d4d"} Dec 03 12:38:33 crc kubenswrapper[4711]: I1203 12:38:33.820401 4711 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 12:38:35 crc kubenswrapper[4711]: I1203 12:38:35.828486 4711 generic.go:334] "Generic (PLEG): container finished" podID="19e4b241-651d-4300-9adb-3ef74168d5f7" containerID="daad5f1621fd703d10eea6978e38ef5d661f9320319f34d5e06b993435d94642" exitCode=0 Dec 03 12:38:35 crc kubenswrapper[4711]: I1203 12:38:35.828556 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms" event={"ID":"19e4b241-651d-4300-9adb-3ef74168d5f7","Type":"ContainerDied","Data":"daad5f1621fd703d10eea6978e38ef5d661f9320319f34d5e06b993435d94642"} Dec 03 12:38:36 crc kubenswrapper[4711]: I1203 12:38:36.835436 4711 generic.go:334] "Generic (PLEG): container finished" podID="19e4b241-651d-4300-9adb-3ef74168d5f7" containerID="0a0a8f582494c12da475f23a5deeb1ea39fe5dc69951110673d196bfad8df5fc" exitCode=0 Dec 03 12:38:36 crc kubenswrapper[4711]: I1203 12:38:36.835560 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms" event={"ID":"19e4b241-651d-4300-9adb-3ef74168d5f7","Type":"ContainerDied","Data":"0a0a8f582494c12da475f23a5deeb1ea39fe5dc69951110673d196bfad8df5fc"} Dec 03 12:38:38 crc kubenswrapper[4711]: I1203 12:38:38.066635 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms" Dec 03 12:38:38 crc kubenswrapper[4711]: I1203 12:38:38.195113 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/19e4b241-651d-4300-9adb-3ef74168d5f7-util\") pod \"19e4b241-651d-4300-9adb-3ef74168d5f7\" (UID: \"19e4b241-651d-4300-9adb-3ef74168d5f7\") " Dec 03 12:38:38 crc kubenswrapper[4711]: I1203 12:38:38.195290 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19e4b241-651d-4300-9adb-3ef74168d5f7-bundle\") pod \"19e4b241-651d-4300-9adb-3ef74168d5f7\" (UID: \"19e4b241-651d-4300-9adb-3ef74168d5f7\") " Dec 03 12:38:38 crc kubenswrapper[4711]: I1203 12:38:38.195328 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h87pf\" (UniqueName: \"kubernetes.io/projected/19e4b241-651d-4300-9adb-3ef74168d5f7-kube-api-access-h87pf\") pod \"19e4b241-651d-4300-9adb-3ef74168d5f7\" (UID: \"19e4b241-651d-4300-9adb-3ef74168d5f7\") " Dec 03 12:38:38 crc kubenswrapper[4711]: I1203 12:38:38.197346 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19e4b241-651d-4300-9adb-3ef74168d5f7-bundle" (OuterVolumeSpecName: "bundle") pod "19e4b241-651d-4300-9adb-3ef74168d5f7" (UID: "19e4b241-651d-4300-9adb-3ef74168d5f7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:38:38 crc kubenswrapper[4711]: I1203 12:38:38.200596 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19e4b241-651d-4300-9adb-3ef74168d5f7-kube-api-access-h87pf" (OuterVolumeSpecName: "kube-api-access-h87pf") pod "19e4b241-651d-4300-9adb-3ef74168d5f7" (UID: "19e4b241-651d-4300-9adb-3ef74168d5f7"). InnerVolumeSpecName "kube-api-access-h87pf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:38:38 crc kubenswrapper[4711]: I1203 12:38:38.296708 4711 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19e4b241-651d-4300-9adb-3ef74168d5f7-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:38:38 crc kubenswrapper[4711]: I1203 12:38:38.296751 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h87pf\" (UniqueName: \"kubernetes.io/projected/19e4b241-651d-4300-9adb-3ef74168d5f7-kube-api-access-h87pf\") on node \"crc\" DevicePath \"\"" Dec 03 12:38:38 crc kubenswrapper[4711]: I1203 12:38:38.445157 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19e4b241-651d-4300-9adb-3ef74168d5f7-util" (OuterVolumeSpecName: "util") pod "19e4b241-651d-4300-9adb-3ef74168d5f7" (UID: "19e4b241-651d-4300-9adb-3ef74168d5f7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:38:38 crc kubenswrapper[4711]: I1203 12:38:38.499164 4711 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/19e4b241-651d-4300-9adb-3ef74168d5f7-util\") on node \"crc\" DevicePath \"\"" Dec 03 12:38:38 crc kubenswrapper[4711]: I1203 12:38:38.850057 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms" event={"ID":"19e4b241-651d-4300-9adb-3ef74168d5f7","Type":"ContainerDied","Data":"83c0a956a82b29ed261a3846978a1819e9dfba63e93a947d4fe2fd096da16d4d"} Dec 03 12:38:38 crc kubenswrapper[4711]: I1203 12:38:38.850098 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83c0a956a82b29ed261a3846978a1819e9dfba63e93a947d4fe2fd096da16d4d" Dec 03 12:38:38 crc kubenswrapper[4711]: I1203 12:38:38.850148 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms" Dec 03 12:38:38 crc kubenswrapper[4711]: E1203 12:38:38.941860 4711 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19e4b241_651d_4300_9adb_3ef74168d5f7.slice\": RecentStats: unable to find data in memory cache]" Dec 03 12:38:40 crc kubenswrapper[4711]: I1203 12:38:40.521604 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wxhxr"] Dec 03 12:38:40 crc kubenswrapper[4711]: E1203 12:38:40.521846 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e4b241-651d-4300-9adb-3ef74168d5f7" containerName="util" Dec 03 12:38:40 crc kubenswrapper[4711]: I1203 12:38:40.521859 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e4b241-651d-4300-9adb-3ef74168d5f7" containerName="util" Dec 03 12:38:40 crc kubenswrapper[4711]: E1203 12:38:40.521873 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e4b241-651d-4300-9adb-3ef74168d5f7" containerName="pull" Dec 03 12:38:40 crc kubenswrapper[4711]: I1203 12:38:40.521880 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e4b241-651d-4300-9adb-3ef74168d5f7" containerName="pull" Dec 03 12:38:40 crc kubenswrapper[4711]: E1203 12:38:40.521898 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e4b241-651d-4300-9adb-3ef74168d5f7" containerName="extract" Dec 03 12:38:40 crc kubenswrapper[4711]: I1203 12:38:40.521909 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e4b241-651d-4300-9adb-3ef74168d5f7" containerName="extract" Dec 03 12:38:40 crc kubenswrapper[4711]: I1203 12:38:40.522039 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="19e4b241-651d-4300-9adb-3ef74168d5f7" containerName="extract" Dec 03 12:38:40 crc kubenswrapper[4711]: I1203 
12:38:40.522960 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wxhxr" Dec 03 12:38:40 crc kubenswrapper[4711]: I1203 12:38:40.533690 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wxhxr"] Dec 03 12:38:40 crc kubenswrapper[4711]: I1203 12:38:40.636058 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/390ba45f-99af-49a8-bdaf-b28a0d052b0e-catalog-content\") pod \"certified-operators-wxhxr\" (UID: \"390ba45f-99af-49a8-bdaf-b28a0d052b0e\") " pod="openshift-marketplace/certified-operators-wxhxr" Dec 03 12:38:40 crc kubenswrapper[4711]: I1203 12:38:40.636383 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/390ba45f-99af-49a8-bdaf-b28a0d052b0e-utilities\") pod \"certified-operators-wxhxr\" (UID: \"390ba45f-99af-49a8-bdaf-b28a0d052b0e\") " pod="openshift-marketplace/certified-operators-wxhxr" Dec 03 12:38:40 crc kubenswrapper[4711]: I1203 12:38:40.636526 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62x58\" (UniqueName: \"kubernetes.io/projected/390ba45f-99af-49a8-bdaf-b28a0d052b0e-kube-api-access-62x58\") pod \"certified-operators-wxhxr\" (UID: \"390ba45f-99af-49a8-bdaf-b28a0d052b0e\") " pod="openshift-marketplace/certified-operators-wxhxr" Dec 03 12:38:40 crc kubenswrapper[4711]: I1203 12:38:40.737590 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/390ba45f-99af-49a8-bdaf-b28a0d052b0e-utilities\") pod \"certified-operators-wxhxr\" (UID: \"390ba45f-99af-49a8-bdaf-b28a0d052b0e\") " pod="openshift-marketplace/certified-operators-wxhxr" Dec 03 12:38:40 crc kubenswrapper[4711]: I1203 
12:38:40.737663 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62x58\" (UniqueName: \"kubernetes.io/projected/390ba45f-99af-49a8-bdaf-b28a0d052b0e-kube-api-access-62x58\") pod \"certified-operators-wxhxr\" (UID: \"390ba45f-99af-49a8-bdaf-b28a0d052b0e\") " pod="openshift-marketplace/certified-operators-wxhxr" Dec 03 12:38:40 crc kubenswrapper[4711]: I1203 12:38:40.737698 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/390ba45f-99af-49a8-bdaf-b28a0d052b0e-catalog-content\") pod \"certified-operators-wxhxr\" (UID: \"390ba45f-99af-49a8-bdaf-b28a0d052b0e\") " pod="openshift-marketplace/certified-operators-wxhxr" Dec 03 12:38:40 crc kubenswrapper[4711]: I1203 12:38:40.738176 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/390ba45f-99af-49a8-bdaf-b28a0d052b0e-utilities\") pod \"certified-operators-wxhxr\" (UID: \"390ba45f-99af-49a8-bdaf-b28a0d052b0e\") " pod="openshift-marketplace/certified-operators-wxhxr" Dec 03 12:38:40 crc kubenswrapper[4711]: I1203 12:38:40.738217 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/390ba45f-99af-49a8-bdaf-b28a0d052b0e-catalog-content\") pod \"certified-operators-wxhxr\" (UID: \"390ba45f-99af-49a8-bdaf-b28a0d052b0e\") " pod="openshift-marketplace/certified-operators-wxhxr" Dec 03 12:38:40 crc kubenswrapper[4711]: I1203 12:38:40.756264 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62x58\" (UniqueName: \"kubernetes.io/projected/390ba45f-99af-49a8-bdaf-b28a0d052b0e-kube-api-access-62x58\") pod \"certified-operators-wxhxr\" (UID: \"390ba45f-99af-49a8-bdaf-b28a0d052b0e\") " pod="openshift-marketplace/certified-operators-wxhxr" Dec 03 12:38:40 crc kubenswrapper[4711]: I1203 12:38:40.837311 4711 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wxhxr" Dec 03 12:38:41 crc kubenswrapper[4711]: I1203 12:38:41.255890 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wxhxr"] Dec 03 12:38:41 crc kubenswrapper[4711]: I1203 12:38:41.864801 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wxhxr" event={"ID":"390ba45f-99af-49a8-bdaf-b28a0d052b0e","Type":"ContainerStarted","Data":"6a550ff5e7b072bbf2b91e50f2043e8e0c07c281821364e7e38427cf578b0a03"} Dec 03 12:38:42 crc kubenswrapper[4711]: I1203 12:38:42.355349 4711 scope.go:117] "RemoveContainer" containerID="0ad2a184bbabe1f39e87385729c4f5006623e99a5008fff374ff8754bba2f093" Dec 03 12:38:42 crc kubenswrapper[4711]: I1203 12:38:42.869862 4711 generic.go:334] "Generic (PLEG): container finished" podID="390ba45f-99af-49a8-bdaf-b28a0d052b0e" containerID="81f2c95d8dc3e9d01ee594a197024b81816223fd8de1d3ec8733da379ac98f8a" exitCode=0 Dec 03 12:38:42 crc kubenswrapper[4711]: I1203 12:38:42.869946 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wxhxr" event={"ID":"390ba45f-99af-49a8-bdaf-b28a0d052b0e","Type":"ContainerDied","Data":"81f2c95d8dc3e9d01ee594a197024b81816223fd8de1d3ec8733da379ac98f8a"} Dec 03 12:38:42 crc kubenswrapper[4711]: I1203 12:38:42.896670 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gwhcr_216c3ac8-462c-49ec-87a2-c935d0c4ad25/kube-multus/2.log" Dec 03 12:38:44 crc kubenswrapper[4711]: I1203 12:38:44.914035 4711 generic.go:334] "Generic (PLEG): container finished" podID="390ba45f-99af-49a8-bdaf-b28a0d052b0e" containerID="68a6329beb6557e66a9571c3e950e6e23b869fe03bb6cc6c688555651af3f0a1" exitCode=0 Dec 03 12:38:44 crc kubenswrapper[4711]: I1203 12:38:44.914164 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-wxhxr" event={"ID":"390ba45f-99af-49a8-bdaf-b28a0d052b0e","Type":"ContainerDied","Data":"68a6329beb6557e66a9571c3e950e6e23b869fe03bb6cc6c688555651af3f0a1"} Dec 03 12:38:45 crc kubenswrapper[4711]: I1203 12:38:45.921707 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wxhxr" event={"ID":"390ba45f-99af-49a8-bdaf-b28a0d052b0e","Type":"ContainerStarted","Data":"e43386ab33c58e631bc5ce95db7a146e38ca732772a5e697aa4344c61ab0a734"} Dec 03 12:38:45 crc kubenswrapper[4711]: I1203 12:38:45.939527 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wxhxr" podStartSLOduration=3.491130197 podStartE2EDuration="5.93950958s" podCreationTimestamp="2025-12-03 12:38:40 +0000 UTC" firstStartedPulling="2025-12-03 12:38:42.875563242 +0000 UTC m=+1441.544814497" lastFinishedPulling="2025-12-03 12:38:45.323942625 +0000 UTC m=+1443.993193880" observedRunningTime="2025-12-03 12:38:45.935229192 +0000 UTC m=+1444.604480477" watchObservedRunningTime="2025-12-03 12:38:45.93950958 +0000 UTC m=+1444.608760855" Dec 03 12:38:47 crc kubenswrapper[4711]: I1203 12:38:47.435339 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-78f9c874c6-vlmhs"] Dec 03 12:38:47 crc kubenswrapper[4711]: I1203 12:38:47.436197 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-78f9c874c6-vlmhs" Dec 03 12:38:47 crc kubenswrapper[4711]: I1203 12:38:47.438429 4711 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 03 12:38:47 crc kubenswrapper[4711]: I1203 12:38:47.438491 4711 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 03 12:38:47 crc kubenswrapper[4711]: I1203 12:38:47.438636 4711 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-lwp4t" Dec 03 12:38:47 crc kubenswrapper[4711]: I1203 12:38:47.438708 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 03 12:38:47 crc kubenswrapper[4711]: I1203 12:38:47.439319 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 03 12:38:47 crc kubenswrapper[4711]: I1203 12:38:47.458799 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-78f9c874c6-vlmhs"] Dec 03 12:38:47 crc kubenswrapper[4711]: I1203 12:38:47.517209 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrnkx\" (UniqueName: \"kubernetes.io/projected/25892076-3b01-4a62-884a-3d658b400d60-kube-api-access-wrnkx\") pod \"metallb-operator-controller-manager-78f9c874c6-vlmhs\" (UID: \"25892076-3b01-4a62-884a-3d658b400d60\") " pod="metallb-system/metallb-operator-controller-manager-78f9c874c6-vlmhs" Dec 03 12:38:47 crc kubenswrapper[4711]: I1203 12:38:47.517308 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/25892076-3b01-4a62-884a-3d658b400d60-webhook-cert\") pod 
\"metallb-operator-controller-manager-78f9c874c6-vlmhs\" (UID: \"25892076-3b01-4a62-884a-3d658b400d60\") " pod="metallb-system/metallb-operator-controller-manager-78f9c874c6-vlmhs" Dec 03 12:38:47 crc kubenswrapper[4711]: I1203 12:38:47.517346 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/25892076-3b01-4a62-884a-3d658b400d60-apiservice-cert\") pod \"metallb-operator-controller-manager-78f9c874c6-vlmhs\" (UID: \"25892076-3b01-4a62-884a-3d658b400d60\") " pod="metallb-system/metallb-operator-controller-manager-78f9c874c6-vlmhs" Dec 03 12:38:47 crc kubenswrapper[4711]: I1203 12:38:47.618074 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrnkx\" (UniqueName: \"kubernetes.io/projected/25892076-3b01-4a62-884a-3d658b400d60-kube-api-access-wrnkx\") pod \"metallb-operator-controller-manager-78f9c874c6-vlmhs\" (UID: \"25892076-3b01-4a62-884a-3d658b400d60\") " pod="metallb-system/metallb-operator-controller-manager-78f9c874c6-vlmhs" Dec 03 12:38:47 crc kubenswrapper[4711]: I1203 12:38:47.618556 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/25892076-3b01-4a62-884a-3d658b400d60-webhook-cert\") pod \"metallb-operator-controller-manager-78f9c874c6-vlmhs\" (UID: \"25892076-3b01-4a62-884a-3d658b400d60\") " pod="metallb-system/metallb-operator-controller-manager-78f9c874c6-vlmhs" Dec 03 12:38:47 crc kubenswrapper[4711]: I1203 12:38:47.620020 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/25892076-3b01-4a62-884a-3d658b400d60-apiservice-cert\") pod \"metallb-operator-controller-manager-78f9c874c6-vlmhs\" (UID: \"25892076-3b01-4a62-884a-3d658b400d60\") " pod="metallb-system/metallb-operator-controller-manager-78f9c874c6-vlmhs" Dec 03 12:38:47 crc 
kubenswrapper[4711]: I1203 12:38:47.630902 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/25892076-3b01-4a62-884a-3d658b400d60-apiservice-cert\") pod \"metallb-operator-controller-manager-78f9c874c6-vlmhs\" (UID: \"25892076-3b01-4a62-884a-3d658b400d60\") " pod="metallb-system/metallb-operator-controller-manager-78f9c874c6-vlmhs" Dec 03 12:38:47 crc kubenswrapper[4711]: I1203 12:38:47.643707 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrnkx\" (UniqueName: \"kubernetes.io/projected/25892076-3b01-4a62-884a-3d658b400d60-kube-api-access-wrnkx\") pod \"metallb-operator-controller-manager-78f9c874c6-vlmhs\" (UID: \"25892076-3b01-4a62-884a-3d658b400d60\") " pod="metallb-system/metallb-operator-controller-manager-78f9c874c6-vlmhs" Dec 03 12:38:47 crc kubenswrapper[4711]: I1203 12:38:47.652197 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/25892076-3b01-4a62-884a-3d658b400d60-webhook-cert\") pod \"metallb-operator-controller-manager-78f9c874c6-vlmhs\" (UID: \"25892076-3b01-4a62-884a-3d658b400d60\") " pod="metallb-system/metallb-operator-controller-manager-78f9c874c6-vlmhs" Dec 03 12:38:47 crc kubenswrapper[4711]: I1203 12:38:47.756827 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-78f9c874c6-vlmhs" Dec 03 12:38:47 crc kubenswrapper[4711]: I1203 12:38:47.791971 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5bc7f8779c-bxqk4"] Dec 03 12:38:47 crc kubenswrapper[4711]: I1203 12:38:47.794891 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5bc7f8779c-bxqk4" Dec 03 12:38:47 crc kubenswrapper[4711]: I1203 12:38:47.798199 4711 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-bc9r6" Dec 03 12:38:47 crc kubenswrapper[4711]: I1203 12:38:47.798634 4711 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 03 12:38:47 crc kubenswrapper[4711]: I1203 12:38:47.800812 4711 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 03 12:38:47 crc kubenswrapper[4711]: I1203 12:38:47.815444 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5bc7f8779c-bxqk4"] Dec 03 12:38:47 crc kubenswrapper[4711]: I1203 12:38:47.924083 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f63d4a78-0d83-4e48-a531-86e871adfa2d-webhook-cert\") pod \"metallb-operator-webhook-server-5bc7f8779c-bxqk4\" (UID: \"f63d4a78-0d83-4e48-a531-86e871adfa2d\") " pod="metallb-system/metallb-operator-webhook-server-5bc7f8779c-bxqk4" Dec 03 12:38:47 crc kubenswrapper[4711]: I1203 12:38:47.924707 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppmsz\" (UniqueName: \"kubernetes.io/projected/f63d4a78-0d83-4e48-a531-86e871adfa2d-kube-api-access-ppmsz\") pod \"metallb-operator-webhook-server-5bc7f8779c-bxqk4\" (UID: \"f63d4a78-0d83-4e48-a531-86e871adfa2d\") " pod="metallb-system/metallb-operator-webhook-server-5bc7f8779c-bxqk4" Dec 03 12:38:47 crc kubenswrapper[4711]: I1203 12:38:47.924740 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/f63d4a78-0d83-4e48-a531-86e871adfa2d-apiservice-cert\") pod \"metallb-operator-webhook-server-5bc7f8779c-bxqk4\" (UID: \"f63d4a78-0d83-4e48-a531-86e871adfa2d\") " pod="metallb-system/metallb-operator-webhook-server-5bc7f8779c-bxqk4" Dec 03 12:38:48 crc kubenswrapper[4711]: I1203 12:38:48.027159 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f63d4a78-0d83-4e48-a531-86e871adfa2d-webhook-cert\") pod \"metallb-operator-webhook-server-5bc7f8779c-bxqk4\" (UID: \"f63d4a78-0d83-4e48-a531-86e871adfa2d\") " pod="metallb-system/metallb-operator-webhook-server-5bc7f8779c-bxqk4" Dec 03 12:38:48 crc kubenswrapper[4711]: I1203 12:38:48.027200 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppmsz\" (UniqueName: \"kubernetes.io/projected/f63d4a78-0d83-4e48-a531-86e871adfa2d-kube-api-access-ppmsz\") pod \"metallb-operator-webhook-server-5bc7f8779c-bxqk4\" (UID: \"f63d4a78-0d83-4e48-a531-86e871adfa2d\") " pod="metallb-system/metallb-operator-webhook-server-5bc7f8779c-bxqk4" Dec 03 12:38:48 crc kubenswrapper[4711]: I1203 12:38:48.027224 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f63d4a78-0d83-4e48-a531-86e871adfa2d-apiservice-cert\") pod \"metallb-operator-webhook-server-5bc7f8779c-bxqk4\" (UID: \"f63d4a78-0d83-4e48-a531-86e871adfa2d\") " pod="metallb-system/metallb-operator-webhook-server-5bc7f8779c-bxqk4" Dec 03 12:38:48 crc kubenswrapper[4711]: I1203 12:38:48.034728 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f63d4a78-0d83-4e48-a531-86e871adfa2d-webhook-cert\") pod \"metallb-operator-webhook-server-5bc7f8779c-bxqk4\" (UID: \"f63d4a78-0d83-4e48-a531-86e871adfa2d\") " pod="metallb-system/metallb-operator-webhook-server-5bc7f8779c-bxqk4" Dec 03 12:38:48 
crc kubenswrapper[4711]: I1203 12:38:48.038977 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f63d4a78-0d83-4e48-a531-86e871adfa2d-apiservice-cert\") pod \"metallb-operator-webhook-server-5bc7f8779c-bxqk4\" (UID: \"f63d4a78-0d83-4e48-a531-86e871adfa2d\") " pod="metallb-system/metallb-operator-webhook-server-5bc7f8779c-bxqk4" Dec 03 12:38:48 crc kubenswrapper[4711]: I1203 12:38:48.044178 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-78f9c874c6-vlmhs"] Dec 03 12:38:48 crc kubenswrapper[4711]: I1203 12:38:48.049769 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppmsz\" (UniqueName: \"kubernetes.io/projected/f63d4a78-0d83-4e48-a531-86e871adfa2d-kube-api-access-ppmsz\") pod \"metallb-operator-webhook-server-5bc7f8779c-bxqk4\" (UID: \"f63d4a78-0d83-4e48-a531-86e871adfa2d\") " pod="metallb-system/metallb-operator-webhook-server-5bc7f8779c-bxqk4" Dec 03 12:38:48 crc kubenswrapper[4711]: I1203 12:38:48.136876 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5bc7f8779c-bxqk4" Dec 03 12:38:48 crc kubenswrapper[4711]: I1203 12:38:48.346327 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5bc7f8779c-bxqk4"] Dec 03 12:38:48 crc kubenswrapper[4711]: W1203 12:38:48.357850 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf63d4a78_0d83_4e48_a531_86e871adfa2d.slice/crio-1d74ad4e6e7cc2f825c8cea94590749fb2907c2e0a1762ff55b6c8594c653ac8 WatchSource:0}: Error finding container 1d74ad4e6e7cc2f825c8cea94590749fb2907c2e0a1762ff55b6c8594c653ac8: Status 404 returned error can't find the container with id 1d74ad4e6e7cc2f825c8cea94590749fb2907c2e0a1762ff55b6c8594c653ac8 Dec 03 12:38:48 crc kubenswrapper[4711]: I1203 12:38:48.947578 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5bc7f8779c-bxqk4" event={"ID":"f63d4a78-0d83-4e48-a531-86e871adfa2d","Type":"ContainerStarted","Data":"1d74ad4e6e7cc2f825c8cea94590749fb2907c2e0a1762ff55b6c8594c653ac8"} Dec 03 12:38:48 crc kubenswrapper[4711]: I1203 12:38:48.949655 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-78f9c874c6-vlmhs" event={"ID":"25892076-3b01-4a62-884a-3d658b400d60","Type":"ContainerStarted","Data":"e04ab9cb34bdaf7ce54efeb2487f5ea49045d8586352dd5d2610f3c5dd996e8e"} Dec 03 12:38:50 crc kubenswrapper[4711]: I1203 12:38:50.837735 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wxhxr" Dec 03 12:38:50 crc kubenswrapper[4711]: I1203 12:38:50.838115 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wxhxr" Dec 03 12:38:50 crc kubenswrapper[4711]: I1203 12:38:50.908709 4711 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/certified-operators-wxhxr" Dec 03 12:38:51 crc kubenswrapper[4711]: I1203 12:38:51.030240 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wxhxr" Dec 03 12:38:51 crc kubenswrapper[4711]: I1203 12:38:51.927175 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n529s"] Dec 03 12:38:51 crc kubenswrapper[4711]: I1203 12:38:51.933031 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n529s" Dec 03 12:38:51 crc kubenswrapper[4711]: I1203 12:38:51.935742 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n529s"] Dec 03 12:38:51 crc kubenswrapper[4711]: I1203 12:38:51.986709 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd1ee230-c82c-4991-a2b0-305def70a5fd-catalog-content\") pod \"community-operators-n529s\" (UID: \"fd1ee230-c82c-4991-a2b0-305def70a5fd\") " pod="openshift-marketplace/community-operators-n529s" Dec 03 12:38:51 crc kubenswrapper[4711]: I1203 12:38:51.986869 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fvww\" (UniqueName: \"kubernetes.io/projected/fd1ee230-c82c-4991-a2b0-305def70a5fd-kube-api-access-4fvww\") pod \"community-operators-n529s\" (UID: \"fd1ee230-c82c-4991-a2b0-305def70a5fd\") " pod="openshift-marketplace/community-operators-n529s" Dec 03 12:38:51 crc kubenswrapper[4711]: I1203 12:38:51.986901 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd1ee230-c82c-4991-a2b0-305def70a5fd-utilities\") pod \"community-operators-n529s\" (UID: 
\"fd1ee230-c82c-4991-a2b0-305def70a5fd\") " pod="openshift-marketplace/community-operators-n529s" Dec 03 12:38:52 crc kubenswrapper[4711]: I1203 12:38:52.087922 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fvww\" (UniqueName: \"kubernetes.io/projected/fd1ee230-c82c-4991-a2b0-305def70a5fd-kube-api-access-4fvww\") pod \"community-operators-n529s\" (UID: \"fd1ee230-c82c-4991-a2b0-305def70a5fd\") " pod="openshift-marketplace/community-operators-n529s" Dec 03 12:38:52 crc kubenswrapper[4711]: I1203 12:38:52.087969 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd1ee230-c82c-4991-a2b0-305def70a5fd-utilities\") pod \"community-operators-n529s\" (UID: \"fd1ee230-c82c-4991-a2b0-305def70a5fd\") " pod="openshift-marketplace/community-operators-n529s" Dec 03 12:38:52 crc kubenswrapper[4711]: I1203 12:38:52.087997 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd1ee230-c82c-4991-a2b0-305def70a5fd-catalog-content\") pod \"community-operators-n529s\" (UID: \"fd1ee230-c82c-4991-a2b0-305def70a5fd\") " pod="openshift-marketplace/community-operators-n529s" Dec 03 12:38:52 crc kubenswrapper[4711]: I1203 12:38:52.088796 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd1ee230-c82c-4991-a2b0-305def70a5fd-utilities\") pod \"community-operators-n529s\" (UID: \"fd1ee230-c82c-4991-a2b0-305def70a5fd\") " pod="openshift-marketplace/community-operators-n529s" Dec 03 12:38:52 crc kubenswrapper[4711]: I1203 12:38:52.088853 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd1ee230-c82c-4991-a2b0-305def70a5fd-catalog-content\") pod \"community-operators-n529s\" (UID: \"fd1ee230-c82c-4991-a2b0-305def70a5fd\") 
" pod="openshift-marketplace/community-operators-n529s" Dec 03 12:38:52 crc kubenswrapper[4711]: I1203 12:38:52.122831 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fvww\" (UniqueName: \"kubernetes.io/projected/fd1ee230-c82c-4991-a2b0-305def70a5fd-kube-api-access-4fvww\") pod \"community-operators-n529s\" (UID: \"fd1ee230-c82c-4991-a2b0-305def70a5fd\") " pod="openshift-marketplace/community-operators-n529s" Dec 03 12:38:52 crc kubenswrapper[4711]: I1203 12:38:52.258034 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n529s" Dec 03 12:38:53 crc kubenswrapper[4711]: I1203 12:38:53.623612 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n529s"] Dec 03 12:38:53 crc kubenswrapper[4711]: I1203 12:38:53.989495 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-78f9c874c6-vlmhs" event={"ID":"25892076-3b01-4a62-884a-3d658b400d60","Type":"ContainerStarted","Data":"48fcf341dcafce16d10b5f9c32df72a903d1fe054c4c4215a542472b4b22aac4"} Dec 03 12:38:53 crc kubenswrapper[4711]: I1203 12:38:53.990069 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-78f9c874c6-vlmhs" Dec 03 12:38:53 crc kubenswrapper[4711]: I1203 12:38:53.992022 4711 generic.go:334] "Generic (PLEG): container finished" podID="fd1ee230-c82c-4991-a2b0-305def70a5fd" containerID="b2858439715214fd7a7df820623627984b0a5c312d6d53d6130a0447690d7e37" exitCode=0 Dec 03 12:38:53 crc kubenswrapper[4711]: I1203 12:38:53.992118 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n529s" event={"ID":"fd1ee230-c82c-4991-a2b0-305def70a5fd","Type":"ContainerDied","Data":"b2858439715214fd7a7df820623627984b0a5c312d6d53d6130a0447690d7e37"} Dec 03 12:38:53 crc kubenswrapper[4711]: I1203 
12:38:53.992157 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n529s" event={"ID":"fd1ee230-c82c-4991-a2b0-305def70a5fd","Type":"ContainerStarted","Data":"96ef199c46d7b5bbea2f1dbd99a5b6197283325619118dbbc26bbaa803377a64"} Dec 03 12:38:53 crc kubenswrapper[4711]: I1203 12:38:53.993951 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5bc7f8779c-bxqk4" event={"ID":"f63d4a78-0d83-4e48-a531-86e871adfa2d","Type":"ContainerStarted","Data":"ad71562189b7cb1b8816f6dcde38bf9b0c88cc047cf323fa9b109a24ab22f450"} Dec 03 12:38:53 crc kubenswrapper[4711]: I1203 12:38:53.994146 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5bc7f8779c-bxqk4" Dec 03 12:38:54 crc kubenswrapper[4711]: I1203 12:38:54.020189 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-78f9c874c6-vlmhs" podStartSLOduration=3.6166478079999997 podStartE2EDuration="7.020167105s" podCreationTimestamp="2025-12-03 12:38:47 +0000 UTC" firstStartedPulling="2025-12-03 12:38:48.047378668 +0000 UTC m=+1446.716629913" lastFinishedPulling="2025-12-03 12:38:51.450897965 +0000 UTC m=+1450.120149210" observedRunningTime="2025-12-03 12:38:54.018074087 +0000 UTC m=+1452.687325372" watchObservedRunningTime="2025-12-03 12:38:54.020167105 +0000 UTC m=+1452.689418370" Dec 03 12:38:54 crc kubenswrapper[4711]: I1203 12:38:54.055398 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5bc7f8779c-bxqk4" podStartSLOduration=2.011956962 podStartE2EDuration="7.055370756s" podCreationTimestamp="2025-12-03 12:38:47 +0000 UTC" firstStartedPulling="2025-12-03 12:38:48.360546219 +0000 UTC m=+1447.029797474" lastFinishedPulling="2025-12-03 12:38:53.403960023 +0000 UTC m=+1452.073211268" observedRunningTime="2025-12-03 
12:38:54.053613818 +0000 UTC m=+1452.722865083" watchObservedRunningTime="2025-12-03 12:38:54.055370756 +0000 UTC m=+1452.724622011"
Dec 03 12:38:54 crc kubenswrapper[4711]: I1203 12:38:54.513406 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wxhxr"]
Dec 03 12:38:54 crc kubenswrapper[4711]: I1203 12:38:54.513610 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wxhxr" podUID="390ba45f-99af-49a8-bdaf-b28a0d052b0e" containerName="registry-server" containerID="cri-o://e43386ab33c58e631bc5ce95db7a146e38ca732772a5e697aa4344c61ab0a734" gracePeriod=2
Dec 03 12:38:56 crc kubenswrapper[4711]: I1203 12:38:56.009474 4711 generic.go:334] "Generic (PLEG): container finished" podID="390ba45f-99af-49a8-bdaf-b28a0d052b0e" containerID="e43386ab33c58e631bc5ce95db7a146e38ca732772a5e697aa4344c61ab0a734" exitCode=0
Dec 03 12:38:56 crc kubenswrapper[4711]: I1203 12:38:56.009530 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wxhxr" event={"ID":"390ba45f-99af-49a8-bdaf-b28a0d052b0e","Type":"ContainerDied","Data":"e43386ab33c58e631bc5ce95db7a146e38ca732772a5e697aa4344c61ab0a734"}
Dec 03 12:38:56 crc kubenswrapper[4711]: I1203 12:38:56.673836 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wxhxr"
Dec 03 12:38:56 crc kubenswrapper[4711]: I1203 12:38:56.772268 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62x58\" (UniqueName: \"kubernetes.io/projected/390ba45f-99af-49a8-bdaf-b28a0d052b0e-kube-api-access-62x58\") pod \"390ba45f-99af-49a8-bdaf-b28a0d052b0e\" (UID: \"390ba45f-99af-49a8-bdaf-b28a0d052b0e\") "
Dec 03 12:38:56 crc kubenswrapper[4711]: I1203 12:38:56.772349 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/390ba45f-99af-49a8-bdaf-b28a0d052b0e-utilities\") pod \"390ba45f-99af-49a8-bdaf-b28a0d052b0e\" (UID: \"390ba45f-99af-49a8-bdaf-b28a0d052b0e\") "
Dec 03 12:38:56 crc kubenswrapper[4711]: I1203 12:38:56.772394 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/390ba45f-99af-49a8-bdaf-b28a0d052b0e-catalog-content\") pod \"390ba45f-99af-49a8-bdaf-b28a0d052b0e\" (UID: \"390ba45f-99af-49a8-bdaf-b28a0d052b0e\") "
Dec 03 12:38:56 crc kubenswrapper[4711]: I1203 12:38:56.774452 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/390ba45f-99af-49a8-bdaf-b28a0d052b0e-utilities" (OuterVolumeSpecName: "utilities") pod "390ba45f-99af-49a8-bdaf-b28a0d052b0e" (UID: "390ba45f-99af-49a8-bdaf-b28a0d052b0e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:38:56 crc kubenswrapper[4711]: I1203 12:38:56.780063 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/390ba45f-99af-49a8-bdaf-b28a0d052b0e-kube-api-access-62x58" (OuterVolumeSpecName: "kube-api-access-62x58") pod "390ba45f-99af-49a8-bdaf-b28a0d052b0e" (UID: "390ba45f-99af-49a8-bdaf-b28a0d052b0e"). InnerVolumeSpecName "kube-api-access-62x58". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:38:56 crc kubenswrapper[4711]: I1203 12:38:56.820161 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/390ba45f-99af-49a8-bdaf-b28a0d052b0e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "390ba45f-99af-49a8-bdaf-b28a0d052b0e" (UID: "390ba45f-99af-49a8-bdaf-b28a0d052b0e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:38:56 crc kubenswrapper[4711]: I1203 12:38:56.873448 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62x58\" (UniqueName: \"kubernetes.io/projected/390ba45f-99af-49a8-bdaf-b28a0d052b0e-kube-api-access-62x58\") on node \"crc\" DevicePath \"\""
Dec 03 12:38:56 crc kubenswrapper[4711]: I1203 12:38:56.873485 4711 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/390ba45f-99af-49a8-bdaf-b28a0d052b0e-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 12:38:56 crc kubenswrapper[4711]: I1203 12:38:56.873496 4711 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/390ba45f-99af-49a8-bdaf-b28a0d052b0e-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 12:38:57 crc kubenswrapper[4711]: I1203 12:38:57.017352 4711 generic.go:334] "Generic (PLEG): container finished" podID="fd1ee230-c82c-4991-a2b0-305def70a5fd" containerID="58bbf1fec03e2004445170c340ad20f648fbb66f7d49e516bcc171e249164f87" exitCode=0
Dec 03 12:38:57 crc kubenswrapper[4711]: I1203 12:38:57.017482 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n529s" event={"ID":"fd1ee230-c82c-4991-a2b0-305def70a5fd","Type":"ContainerDied","Data":"58bbf1fec03e2004445170c340ad20f648fbb66f7d49e516bcc171e249164f87"}
Dec 03 12:38:57 crc kubenswrapper[4711]: I1203 12:38:57.020044 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wxhxr" event={"ID":"390ba45f-99af-49a8-bdaf-b28a0d052b0e","Type":"ContainerDied","Data":"6a550ff5e7b072bbf2b91e50f2043e8e0c07c281821364e7e38427cf578b0a03"}
Dec 03 12:38:57 crc kubenswrapper[4711]: I1203 12:38:57.020086 4711 scope.go:117] "RemoveContainer" containerID="e43386ab33c58e631bc5ce95db7a146e38ca732772a5e697aa4344c61ab0a734"
Dec 03 12:38:57 crc kubenswrapper[4711]: I1203 12:38:57.020259 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wxhxr"
Dec 03 12:38:57 crc kubenswrapper[4711]: I1203 12:38:57.035606 4711 scope.go:117] "RemoveContainer" containerID="68a6329beb6557e66a9571c3e950e6e23b869fe03bb6cc6c688555651af3f0a1"
Dec 03 12:38:57 crc kubenswrapper[4711]: I1203 12:38:57.054109 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wxhxr"]
Dec 03 12:38:57 crc kubenswrapper[4711]: I1203 12:38:57.057833 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wxhxr"]
Dec 03 12:38:57 crc kubenswrapper[4711]: I1203 12:38:57.063808 4711 scope.go:117] "RemoveContainer" containerID="81f2c95d8dc3e9d01ee594a197024b81816223fd8de1d3ec8733da379ac98f8a"
Dec 03 12:38:57 crc kubenswrapper[4711]: I1203 12:38:57.826696 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="390ba45f-99af-49a8-bdaf-b28a0d052b0e" path="/var/lib/kubelet/pods/390ba45f-99af-49a8-bdaf-b28a0d052b0e/volumes"
Dec 03 12:38:59 crc kubenswrapper[4711]: I1203 12:38:59.043331 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n529s" event={"ID":"fd1ee230-c82c-4991-a2b0-305def70a5fd","Type":"ContainerStarted","Data":"a1847eef1e979b580f4aac24608cd17716c75b33d7e8e89f1e263592dab13a5b"}
Dec 03 12:38:59 crc kubenswrapper[4711]: I1203 12:38:59.062951 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n529s" podStartSLOduration=4.090757494 podStartE2EDuration="8.062930982s" podCreationTimestamp="2025-12-03 12:38:51 +0000 UTC" firstStartedPulling="2025-12-03 12:38:53.994351142 +0000 UTC m=+1452.663602397" lastFinishedPulling="2025-12-03 12:38:57.96652463 +0000 UTC m=+1456.635775885" observedRunningTime="2025-12-03 12:38:59.060285509 +0000 UTC m=+1457.729536764" watchObservedRunningTime="2025-12-03 12:38:59.062930982 +0000 UTC m=+1457.732182237"
Dec 03 12:39:02 crc kubenswrapper[4711]: I1203 12:39:02.258449 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n529s"
Dec 03 12:39:02 crc kubenswrapper[4711]: I1203 12:39:02.258729 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n529s"
Dec 03 12:39:02 crc kubenswrapper[4711]: I1203 12:39:02.328771 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n529s"
Dec 03 12:39:03 crc kubenswrapper[4711]: I1203 12:39:03.158350 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n529s"
Dec 03 12:39:05 crc kubenswrapper[4711]: I1203 12:39:05.513747 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n529s"]
Dec 03 12:39:05 crc kubenswrapper[4711]: I1203 12:39:05.515106 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n529s" podUID="fd1ee230-c82c-4991-a2b0-305def70a5fd" containerName="registry-server" containerID="cri-o://a1847eef1e979b580f4aac24608cd17716c75b33d7e8e89f1e263592dab13a5b" gracePeriod=2
Dec 03 12:39:07 crc kubenswrapper[4711]: I1203 12:39:07.089521 4711 generic.go:334] "Generic (PLEG): container finished" podID="fd1ee230-c82c-4991-a2b0-305def70a5fd" containerID="a1847eef1e979b580f4aac24608cd17716c75b33d7e8e89f1e263592dab13a5b" exitCode=0
Dec 03 12:39:07 crc kubenswrapper[4711]: I1203 12:39:07.089719 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n529s" event={"ID":"fd1ee230-c82c-4991-a2b0-305def70a5fd","Type":"ContainerDied","Data":"a1847eef1e979b580f4aac24608cd17716c75b33d7e8e89f1e263592dab13a5b"}
Dec 03 12:39:07 crc kubenswrapper[4711]: I1203 12:39:07.270294 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n529s"
Dec 03 12:39:07 crc kubenswrapper[4711]: I1203 12:39:07.412377 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fvww\" (UniqueName: \"kubernetes.io/projected/fd1ee230-c82c-4991-a2b0-305def70a5fd-kube-api-access-4fvww\") pod \"fd1ee230-c82c-4991-a2b0-305def70a5fd\" (UID: \"fd1ee230-c82c-4991-a2b0-305def70a5fd\") "
Dec 03 12:39:07 crc kubenswrapper[4711]: I1203 12:39:07.412475 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd1ee230-c82c-4991-a2b0-305def70a5fd-utilities\") pod \"fd1ee230-c82c-4991-a2b0-305def70a5fd\" (UID: \"fd1ee230-c82c-4991-a2b0-305def70a5fd\") "
Dec 03 12:39:07 crc kubenswrapper[4711]: I1203 12:39:07.412513 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd1ee230-c82c-4991-a2b0-305def70a5fd-catalog-content\") pod \"fd1ee230-c82c-4991-a2b0-305def70a5fd\" (UID: \"fd1ee230-c82c-4991-a2b0-305def70a5fd\") "
Dec 03 12:39:07 crc kubenswrapper[4711]: I1203 12:39:07.413747 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd1ee230-c82c-4991-a2b0-305def70a5fd-utilities" (OuterVolumeSpecName: "utilities") pod "fd1ee230-c82c-4991-a2b0-305def70a5fd" (UID: "fd1ee230-c82c-4991-a2b0-305def70a5fd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:39:07 crc kubenswrapper[4711]: I1203 12:39:07.421303 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd1ee230-c82c-4991-a2b0-305def70a5fd-kube-api-access-4fvww" (OuterVolumeSpecName: "kube-api-access-4fvww") pod "fd1ee230-c82c-4991-a2b0-305def70a5fd" (UID: "fd1ee230-c82c-4991-a2b0-305def70a5fd"). InnerVolumeSpecName "kube-api-access-4fvww". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:39:07 crc kubenswrapper[4711]: I1203 12:39:07.472174 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd1ee230-c82c-4991-a2b0-305def70a5fd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd1ee230-c82c-4991-a2b0-305def70a5fd" (UID: "fd1ee230-c82c-4991-a2b0-305def70a5fd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:39:07 crc kubenswrapper[4711]: I1203 12:39:07.514804 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fvww\" (UniqueName: \"kubernetes.io/projected/fd1ee230-c82c-4991-a2b0-305def70a5fd-kube-api-access-4fvww\") on node \"crc\" DevicePath \"\""
Dec 03 12:39:07 crc kubenswrapper[4711]: I1203 12:39:07.514833 4711 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd1ee230-c82c-4991-a2b0-305def70a5fd-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 12:39:07 crc kubenswrapper[4711]: I1203 12:39:07.514842 4711 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd1ee230-c82c-4991-a2b0-305def70a5fd-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 12:39:08 crc kubenswrapper[4711]: I1203 12:39:08.098830 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n529s" event={"ID":"fd1ee230-c82c-4991-a2b0-305def70a5fd","Type":"ContainerDied","Data":"96ef199c46d7b5bbea2f1dbd99a5b6197283325619118dbbc26bbaa803377a64"}
Dec 03 12:39:08 crc kubenswrapper[4711]: I1203 12:39:08.098876 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n529s"
Dec 03 12:39:08 crc kubenswrapper[4711]: I1203 12:39:08.098897 4711 scope.go:117] "RemoveContainer" containerID="a1847eef1e979b580f4aac24608cd17716c75b33d7e8e89f1e263592dab13a5b"
Dec 03 12:39:08 crc kubenswrapper[4711]: I1203 12:39:08.118458 4711 scope.go:117] "RemoveContainer" containerID="58bbf1fec03e2004445170c340ad20f648fbb66f7d49e516bcc171e249164f87"
Dec 03 12:39:08 crc kubenswrapper[4711]: I1203 12:39:08.121571 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n529s"]
Dec 03 12:39:08 crc kubenswrapper[4711]: I1203 12:39:08.126984 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n529s"]
Dec 03 12:39:08 crc kubenswrapper[4711]: I1203 12:39:08.135678 4711 scope.go:117] "RemoveContainer" containerID="b2858439715214fd7a7df820623627984b0a5c312d6d53d6130a0447690d7e37"
Dec 03 12:39:08 crc kubenswrapper[4711]: I1203 12:39:08.144334 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5bc7f8779c-bxqk4"
Dec 03 12:39:09 crc kubenswrapper[4711]: I1203 12:39:09.823503 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd1ee230-c82c-4991-a2b0-305def70a5fd" path="/var/lib/kubelet/pods/fd1ee230-c82c-4991-a2b0-305def70a5fd/volumes"
Dec 03 12:39:12 crc kubenswrapper[4711]: I1203 12:39:12.812506 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kgqzn"]
Dec 03 12:39:12 crc kubenswrapper[4711]: E1203 12:39:12.812722 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="390ba45f-99af-49a8-bdaf-b28a0d052b0e" containerName="extract-utilities"
Dec 03 12:39:12 crc kubenswrapper[4711]: I1203 12:39:12.812739 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="390ba45f-99af-49a8-bdaf-b28a0d052b0e" containerName="extract-utilities"
Dec 03 12:39:12 crc kubenswrapper[4711]: E1203 12:39:12.812750 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd1ee230-c82c-4991-a2b0-305def70a5fd" containerName="extract-utilities"
Dec 03 12:39:12 crc kubenswrapper[4711]: I1203 12:39:12.812758 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd1ee230-c82c-4991-a2b0-305def70a5fd" containerName="extract-utilities"
Dec 03 12:39:12 crc kubenswrapper[4711]: E1203 12:39:12.812767 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd1ee230-c82c-4991-a2b0-305def70a5fd" containerName="registry-server"
Dec 03 12:39:12 crc kubenswrapper[4711]: I1203 12:39:12.812774 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd1ee230-c82c-4991-a2b0-305def70a5fd" containerName="registry-server"
Dec 03 12:39:12 crc kubenswrapper[4711]: E1203 12:39:12.812787 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="390ba45f-99af-49a8-bdaf-b28a0d052b0e" containerName="registry-server"
Dec 03 12:39:12 crc kubenswrapper[4711]: I1203 12:39:12.812794 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="390ba45f-99af-49a8-bdaf-b28a0d052b0e" containerName="registry-server"
Dec 03 12:39:12 crc kubenswrapper[4711]: E1203 12:39:12.812805 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="390ba45f-99af-49a8-bdaf-b28a0d052b0e" containerName="extract-content"
Dec 03 12:39:12 crc kubenswrapper[4711]: I1203 12:39:12.812812 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="390ba45f-99af-49a8-bdaf-b28a0d052b0e" containerName="extract-content"
Dec 03 12:39:12 crc kubenswrapper[4711]: E1203 12:39:12.812821 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd1ee230-c82c-4991-a2b0-305def70a5fd" containerName="extract-content"
Dec 03 12:39:12 crc kubenswrapper[4711]: I1203 12:39:12.812828 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd1ee230-c82c-4991-a2b0-305def70a5fd" containerName="extract-content"
Dec 03 12:39:12 crc kubenswrapper[4711]: I1203 12:39:12.812952 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="390ba45f-99af-49a8-bdaf-b28a0d052b0e" containerName="registry-server"
Dec 03 12:39:12 crc kubenswrapper[4711]: I1203 12:39:12.812970 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd1ee230-c82c-4991-a2b0-305def70a5fd" containerName="registry-server"
Dec 03 12:39:12 crc kubenswrapper[4711]: I1203 12:39:12.813692 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kgqzn"
Dec 03 12:39:12 crc kubenswrapper[4711]: I1203 12:39:12.828747 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kgqzn"]
Dec 03 12:39:12 crc kubenswrapper[4711]: I1203 12:39:12.886016 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56322560-09d0-47d3-a596-06956845fcf1-catalog-content\") pod \"redhat-marketplace-kgqzn\" (UID: \"56322560-09d0-47d3-a596-06956845fcf1\") " pod="openshift-marketplace/redhat-marketplace-kgqzn"
Dec 03 12:39:12 crc kubenswrapper[4711]: I1203 12:39:12.886095 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8ztz\" (UniqueName: \"kubernetes.io/projected/56322560-09d0-47d3-a596-06956845fcf1-kube-api-access-s8ztz\") pod \"redhat-marketplace-kgqzn\" (UID: \"56322560-09d0-47d3-a596-06956845fcf1\") " pod="openshift-marketplace/redhat-marketplace-kgqzn"
Dec 03 12:39:12 crc kubenswrapper[4711]: I1203 12:39:12.886139 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56322560-09d0-47d3-a596-06956845fcf1-utilities\") pod \"redhat-marketplace-kgqzn\" (UID: \"56322560-09d0-47d3-a596-06956845fcf1\") " pod="openshift-marketplace/redhat-marketplace-kgqzn"
Dec 03 12:39:12 crc kubenswrapper[4711]: I1203 12:39:12.986753 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56322560-09d0-47d3-a596-06956845fcf1-catalog-content\") pod \"redhat-marketplace-kgqzn\" (UID: \"56322560-09d0-47d3-a596-06956845fcf1\") " pod="openshift-marketplace/redhat-marketplace-kgqzn"
Dec 03 12:39:12 crc kubenswrapper[4711]: I1203 12:39:12.987095 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8ztz\" (UniqueName: \"kubernetes.io/projected/56322560-09d0-47d3-a596-06956845fcf1-kube-api-access-s8ztz\") pod \"redhat-marketplace-kgqzn\" (UID: \"56322560-09d0-47d3-a596-06956845fcf1\") " pod="openshift-marketplace/redhat-marketplace-kgqzn"
Dec 03 12:39:12 crc kubenswrapper[4711]: I1203 12:39:12.987140 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56322560-09d0-47d3-a596-06956845fcf1-utilities\") pod \"redhat-marketplace-kgqzn\" (UID: \"56322560-09d0-47d3-a596-06956845fcf1\") " pod="openshift-marketplace/redhat-marketplace-kgqzn"
Dec 03 12:39:12 crc kubenswrapper[4711]: I1203 12:39:12.987307 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56322560-09d0-47d3-a596-06956845fcf1-catalog-content\") pod \"redhat-marketplace-kgqzn\" (UID: \"56322560-09d0-47d3-a596-06956845fcf1\") " pod="openshift-marketplace/redhat-marketplace-kgqzn"
Dec 03 12:39:12 crc kubenswrapper[4711]: I1203 12:39:12.987543 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56322560-09d0-47d3-a596-06956845fcf1-utilities\") pod \"redhat-marketplace-kgqzn\" (UID: \"56322560-09d0-47d3-a596-06956845fcf1\") " pod="openshift-marketplace/redhat-marketplace-kgqzn"
Dec 03 12:39:13 crc kubenswrapper[4711]: I1203 12:39:13.015640 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8ztz\" (UniqueName: \"kubernetes.io/projected/56322560-09d0-47d3-a596-06956845fcf1-kube-api-access-s8ztz\") pod \"redhat-marketplace-kgqzn\" (UID: \"56322560-09d0-47d3-a596-06956845fcf1\") " pod="openshift-marketplace/redhat-marketplace-kgqzn"
Dec 03 12:39:13 crc kubenswrapper[4711]: I1203 12:39:13.133480 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kgqzn"
Dec 03 12:39:13 crc kubenswrapper[4711]: I1203 12:39:13.346691 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kgqzn"]
Dec 03 12:39:13 crc kubenswrapper[4711]: W1203 12:39:13.362240 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56322560_09d0_47d3_a596_06956845fcf1.slice/crio-451d2e9acdc75728ebc57cf89f156bce175e173cfe0a120b8ec5bf91f4129136 WatchSource:0}: Error finding container 451d2e9acdc75728ebc57cf89f156bce175e173cfe0a120b8ec5bf91f4129136: Status 404 returned error can't find the container with id 451d2e9acdc75728ebc57cf89f156bce175e173cfe0a120b8ec5bf91f4129136
Dec 03 12:39:14 crc kubenswrapper[4711]: I1203 12:39:14.130224 4711 generic.go:334] "Generic (PLEG): container finished" podID="56322560-09d0-47d3-a596-06956845fcf1" containerID="2959998c272b7a98fa2c7cc55b1cc8dbe95f663d2ac768efde7cabde6fc816a3" exitCode=0
Dec 03 12:39:14 crc kubenswrapper[4711]: I1203 12:39:14.130285 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgqzn" event={"ID":"56322560-09d0-47d3-a596-06956845fcf1","Type":"ContainerDied","Data":"2959998c272b7a98fa2c7cc55b1cc8dbe95f663d2ac768efde7cabde6fc816a3"}
Dec 03 12:39:14 crc kubenswrapper[4711]: I1203 12:39:14.130349 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgqzn" event={"ID":"56322560-09d0-47d3-a596-06956845fcf1","Type":"ContainerStarted","Data":"451d2e9acdc75728ebc57cf89f156bce175e173cfe0a120b8ec5bf91f4129136"}
Dec 03 12:39:15 crc kubenswrapper[4711]: I1203 12:39:15.139964 4711 generic.go:334] "Generic (PLEG): container finished" podID="56322560-09d0-47d3-a596-06956845fcf1" containerID="5c35fc2279895c5ec12525e359ebb23712d86fe5b7ebc5657efb247c146772fb" exitCode=0
Dec 03 12:39:15 crc kubenswrapper[4711]: I1203 12:39:15.140026 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgqzn" event={"ID":"56322560-09d0-47d3-a596-06956845fcf1","Type":"ContainerDied","Data":"5c35fc2279895c5ec12525e359ebb23712d86fe5b7ebc5657efb247c146772fb"}
Dec 03 12:39:16 crc kubenswrapper[4711]: I1203 12:39:16.148617 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgqzn" event={"ID":"56322560-09d0-47d3-a596-06956845fcf1","Type":"ContainerStarted","Data":"255617141a3683095e2005040b7e7f6afd331fc327ebb86df9b4999d35e3d962"}
Dec 03 12:39:16 crc kubenswrapper[4711]: I1203 12:39:16.175175 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kgqzn" podStartSLOduration=2.754879832 podStartE2EDuration="4.175153018s" podCreationTimestamp="2025-12-03 12:39:12 +0000 UTC" firstStartedPulling="2025-12-03 12:39:14.133027024 +0000 UTC m=+1472.802278279" lastFinishedPulling="2025-12-03 12:39:15.55330021 +0000 UTC m=+1474.222551465" observedRunningTime="2025-12-03 12:39:16.170416357 +0000 UTC m=+1474.839667612" watchObservedRunningTime="2025-12-03 12:39:16.175153018 +0000 UTC m=+1474.844404283"
Dec 03 12:39:23 crc kubenswrapper[4711]: I1203 12:39:23.134018 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kgqzn"
Dec 03 12:39:23 crc kubenswrapper[4711]: I1203 12:39:23.134386 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kgqzn"
Dec 03 12:39:23 crc kubenswrapper[4711]: I1203 12:39:23.192597 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kgqzn"
Dec 03 12:39:23 crc kubenswrapper[4711]: I1203 12:39:23.245890 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kgqzn"
Dec 03 12:39:23 crc kubenswrapper[4711]: I1203 12:39:23.426411 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kgqzn"]
Dec 03 12:39:25 crc kubenswrapper[4711]: I1203 12:39:25.208103 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kgqzn" podUID="56322560-09d0-47d3-a596-06956845fcf1" containerName="registry-server" containerID="cri-o://255617141a3683095e2005040b7e7f6afd331fc327ebb86df9b4999d35e3d962" gracePeriod=2
Dec 03 12:39:25 crc kubenswrapper[4711]: I1203 12:39:25.595620 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kgqzn"
Dec 03 12:39:25 crc kubenswrapper[4711]: I1203 12:39:25.720050 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56322560-09d0-47d3-a596-06956845fcf1-catalog-content\") pod \"56322560-09d0-47d3-a596-06956845fcf1\" (UID: \"56322560-09d0-47d3-a596-06956845fcf1\") "
Dec 03 12:39:25 crc kubenswrapper[4711]: I1203 12:39:25.720212 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8ztz\" (UniqueName: \"kubernetes.io/projected/56322560-09d0-47d3-a596-06956845fcf1-kube-api-access-s8ztz\") pod \"56322560-09d0-47d3-a596-06956845fcf1\" (UID: \"56322560-09d0-47d3-a596-06956845fcf1\") "
Dec 03 12:39:25 crc kubenswrapper[4711]: I1203 12:39:25.721023 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56322560-09d0-47d3-a596-06956845fcf1-utilities\") pod \"56322560-09d0-47d3-a596-06956845fcf1\" (UID: \"56322560-09d0-47d3-a596-06956845fcf1\") "
Dec 03 12:39:25 crc kubenswrapper[4711]: I1203 12:39:25.721766 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56322560-09d0-47d3-a596-06956845fcf1-utilities" (OuterVolumeSpecName: "utilities") pod "56322560-09d0-47d3-a596-06956845fcf1" (UID: "56322560-09d0-47d3-a596-06956845fcf1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:39:25 crc kubenswrapper[4711]: I1203 12:39:25.727226 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56322560-09d0-47d3-a596-06956845fcf1-kube-api-access-s8ztz" (OuterVolumeSpecName: "kube-api-access-s8ztz") pod "56322560-09d0-47d3-a596-06956845fcf1" (UID: "56322560-09d0-47d3-a596-06956845fcf1"). InnerVolumeSpecName "kube-api-access-s8ztz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:39:25 crc kubenswrapper[4711]: I1203 12:39:25.743853 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56322560-09d0-47d3-a596-06956845fcf1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56322560-09d0-47d3-a596-06956845fcf1" (UID: "56322560-09d0-47d3-a596-06956845fcf1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:39:25 crc kubenswrapper[4711]: I1203 12:39:25.822017 4711 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56322560-09d0-47d3-a596-06956845fcf1-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 12:39:25 crc kubenswrapper[4711]: I1203 12:39:25.822063 4711 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56322560-09d0-47d3-a596-06956845fcf1-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 12:39:25 crc kubenswrapper[4711]: I1203 12:39:25.822075 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8ztz\" (UniqueName: \"kubernetes.io/projected/56322560-09d0-47d3-a596-06956845fcf1-kube-api-access-s8ztz\") on node \"crc\" DevicePath \"\""
Dec 03 12:39:26 crc kubenswrapper[4711]: I1203 12:39:26.214634 4711 generic.go:334] "Generic (PLEG): container finished" podID="56322560-09d0-47d3-a596-06956845fcf1" containerID="255617141a3683095e2005040b7e7f6afd331fc327ebb86df9b4999d35e3d962" exitCode=0
Dec 03 12:39:26 crc kubenswrapper[4711]: I1203 12:39:26.214677 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgqzn" event={"ID":"56322560-09d0-47d3-a596-06956845fcf1","Type":"ContainerDied","Data":"255617141a3683095e2005040b7e7f6afd331fc327ebb86df9b4999d35e3d962"}
Dec 03 12:39:26 crc kubenswrapper[4711]: I1203 12:39:26.214702 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgqzn" event={"ID":"56322560-09d0-47d3-a596-06956845fcf1","Type":"ContainerDied","Data":"451d2e9acdc75728ebc57cf89f156bce175e173cfe0a120b8ec5bf91f4129136"}
Dec 03 12:39:26 crc kubenswrapper[4711]: I1203 12:39:26.214718 4711 scope.go:117] "RemoveContainer" containerID="255617141a3683095e2005040b7e7f6afd331fc327ebb86df9b4999d35e3d962"
Dec 03 12:39:26 crc kubenswrapper[4711]: I1203 12:39:26.214820 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kgqzn"
Dec 03 12:39:26 crc kubenswrapper[4711]: I1203 12:39:26.232779 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kgqzn"]
Dec 03 12:39:26 crc kubenswrapper[4711]: I1203 12:39:26.236363 4711 scope.go:117] "RemoveContainer" containerID="5c35fc2279895c5ec12525e359ebb23712d86fe5b7ebc5657efb247c146772fb"
Dec 03 12:39:26 crc kubenswrapper[4711]: I1203 12:39:26.243797 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kgqzn"]
Dec 03 12:39:26 crc kubenswrapper[4711]: I1203 12:39:26.256193 4711 scope.go:117] "RemoveContainer" containerID="2959998c272b7a98fa2c7cc55b1cc8dbe95f663d2ac768efde7cabde6fc816a3"
Dec 03 12:39:26 crc kubenswrapper[4711]: I1203 12:39:26.270267 4711 scope.go:117] "RemoveContainer" containerID="255617141a3683095e2005040b7e7f6afd331fc327ebb86df9b4999d35e3d962"
Dec 03 12:39:26 crc kubenswrapper[4711]: E1203 12:39:26.270729 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"255617141a3683095e2005040b7e7f6afd331fc327ebb86df9b4999d35e3d962\": container with ID starting with 255617141a3683095e2005040b7e7f6afd331fc327ebb86df9b4999d35e3d962 not found: ID does not exist" containerID="255617141a3683095e2005040b7e7f6afd331fc327ebb86df9b4999d35e3d962"
Dec 03 12:39:26 crc kubenswrapper[4711]: I1203 12:39:26.270759 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"255617141a3683095e2005040b7e7f6afd331fc327ebb86df9b4999d35e3d962"} err="failed to get container status \"255617141a3683095e2005040b7e7f6afd331fc327ebb86df9b4999d35e3d962\": rpc error: code = NotFound desc = could not find container \"255617141a3683095e2005040b7e7f6afd331fc327ebb86df9b4999d35e3d962\": container with ID starting with 255617141a3683095e2005040b7e7f6afd331fc327ebb86df9b4999d35e3d962 not found: ID does not exist"
Dec 03 12:39:26 crc kubenswrapper[4711]: I1203 12:39:26.270781 4711 scope.go:117] "RemoveContainer" containerID="5c35fc2279895c5ec12525e359ebb23712d86fe5b7ebc5657efb247c146772fb"
Dec 03 12:39:26 crc kubenswrapper[4711]: E1203 12:39:26.271040 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c35fc2279895c5ec12525e359ebb23712d86fe5b7ebc5657efb247c146772fb\": container with ID starting with 5c35fc2279895c5ec12525e359ebb23712d86fe5b7ebc5657efb247c146772fb not found: ID does not exist" containerID="5c35fc2279895c5ec12525e359ebb23712d86fe5b7ebc5657efb247c146772fb"
Dec 03 12:39:26 crc kubenswrapper[4711]: I1203 12:39:26.271074 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c35fc2279895c5ec12525e359ebb23712d86fe5b7ebc5657efb247c146772fb"} err="failed to get container status \"5c35fc2279895c5ec12525e359ebb23712d86fe5b7ebc5657efb247c146772fb\": rpc error: code = NotFound desc = could not find container \"5c35fc2279895c5ec12525e359ebb23712d86fe5b7ebc5657efb247c146772fb\": container with ID starting with 5c35fc2279895c5ec12525e359ebb23712d86fe5b7ebc5657efb247c146772fb not found: ID does not exist"
Dec 03 12:39:26 crc kubenswrapper[4711]: I1203 12:39:26.271094 4711 scope.go:117] "RemoveContainer" containerID="2959998c272b7a98fa2c7cc55b1cc8dbe95f663d2ac768efde7cabde6fc816a3"
Dec 03 12:39:26 crc kubenswrapper[4711]: E1203 12:39:26.272140 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2959998c272b7a98fa2c7cc55b1cc8dbe95f663d2ac768efde7cabde6fc816a3\": container with ID starting with 2959998c272b7a98fa2c7cc55b1cc8dbe95f663d2ac768efde7cabde6fc816a3 not found: ID does not exist" containerID="2959998c272b7a98fa2c7cc55b1cc8dbe95f663d2ac768efde7cabde6fc816a3"
Dec 03 12:39:26 crc kubenswrapper[4711]: I1203 12:39:26.272174 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2959998c272b7a98fa2c7cc55b1cc8dbe95f663d2ac768efde7cabde6fc816a3"} err="failed to get container status \"2959998c272b7a98fa2c7cc55b1cc8dbe95f663d2ac768efde7cabde6fc816a3\": rpc error: code = NotFound desc = could not find container \"2959998c272b7a98fa2c7cc55b1cc8dbe95f663d2ac768efde7cabde6fc816a3\": container with ID starting with 2959998c272b7a98fa2c7cc55b1cc8dbe95f663d2ac768efde7cabde6fc816a3 not found: ID does not exist"
Dec 03 12:39:27 crc kubenswrapper[4711]: I1203 12:39:27.759377 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-78f9c874c6-vlmhs"
Dec 03 12:39:27 crc kubenswrapper[4711]: I1203 12:39:27.827590 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56322560-09d0-47d3-a596-06956845fcf1" path="/var/lib/kubelet/pods/56322560-09d0-47d3-a596-06956845fcf1/volumes"
Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.460996 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-m42lk"]
Dec 03 12:39:28 crc kubenswrapper[4711]: E1203 12:39:28.461657 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56322560-09d0-47d3-a596-06956845fcf1" containerName="extract-utilities"
Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.461679 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="56322560-09d0-47d3-a596-06956845fcf1" containerName="extract-utilities"
Dec 03 12:39:28 crc kubenswrapper[4711]: E1203 12:39:28.461693 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56322560-09d0-47d3-a596-06956845fcf1" containerName="registry-server"
Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.461701 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="56322560-09d0-47d3-a596-06956845fcf1" containerName="registry-server"
Dec 03 12:39:28 crc kubenswrapper[4711]: E1203 12:39:28.461717 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56322560-09d0-47d3-a596-06956845fcf1" containerName="extract-content"
Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.461726 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="56322560-09d0-47d3-a596-06956845fcf1" containerName="extract-content"
Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.461846 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="56322560-09d0-47d3-a596-06956845fcf1" containerName="registry-server"
Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.464166 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-m42lk"
Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.465514 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-fr4tv"]
Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.466329 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fr4tv"
Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.466487 4711 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.466797 4711 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-9vvxd"
Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.466953 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.467765 4711 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.479776 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-fr4tv"]
Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.548166 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-9xmmz"]
Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.549447 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-9xmmz" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.551242 4711 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-vvd7g" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.551442 4711 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.551490 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.551750 4711 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.563356 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-sflcc"] Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.564410 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-sflcc" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.566021 4711 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.587738 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-sflcc"] Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.657440 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f567c71f-c1ec-47c6-9173-8c1a29524cf8-metrics\") pod \"frr-k8s-m42lk\" (UID: \"f567c71f-c1ec-47c6-9173-8c1a29524cf8\") " pod="metallb-system/frr-k8s-m42lk" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.657719 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f4c50fcb-c298-476e-b6c6-4af490b8a6ed-memberlist\") pod \"speaker-9xmmz\" (UID: \"f4c50fcb-c298-476e-b6c6-4af490b8a6ed\") " pod="metallb-system/speaker-9xmmz" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.657798 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f567c71f-c1ec-47c6-9173-8c1a29524cf8-frr-sockets\") pod \"frr-k8s-m42lk\" (UID: \"f567c71f-c1ec-47c6-9173-8c1a29524cf8\") " pod="metallb-system/frr-k8s-m42lk" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.657864 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f567c71f-c1ec-47c6-9173-8c1a29524cf8-frr-conf\") pod \"frr-k8s-m42lk\" (UID: \"f567c71f-c1ec-47c6-9173-8c1a29524cf8\") " pod="metallb-system/frr-k8s-m42lk" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.657933 4711 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f567c71f-c1ec-47c6-9173-8c1a29524cf8-reloader\") pod \"frr-k8s-m42lk\" (UID: \"f567c71f-c1ec-47c6-9173-8c1a29524cf8\") " pod="metallb-system/frr-k8s-m42lk" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.657980 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c50fcb-c298-476e-b6c6-4af490b8a6ed-metrics-certs\") pod \"speaker-9xmmz\" (UID: \"f4c50fcb-c298-476e-b6c6-4af490b8a6ed\") " pod="metallb-system/speaker-9xmmz" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.658031 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmgg7\" (UniqueName: \"kubernetes.io/projected/f4c50fcb-c298-476e-b6c6-4af490b8a6ed-kube-api-access-wmgg7\") pod \"speaker-9xmmz\" (UID: \"f4c50fcb-c298-476e-b6c6-4af490b8a6ed\") " pod="metallb-system/speaker-9xmmz" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.658077 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/947998b3-f1a3-486a-91b4-108fcc09af6d-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-fr4tv\" (UID: \"947998b3-f1a3-486a-91b4-108fcc09af6d\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fr4tv" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.658098 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f567c71f-c1ec-47c6-9173-8c1a29524cf8-frr-startup\") pod \"frr-k8s-m42lk\" (UID: \"f567c71f-c1ec-47c6-9173-8c1a29524cf8\") " pod="metallb-system/frr-k8s-m42lk" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.658128 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f4c50fcb-c298-476e-b6c6-4af490b8a6ed-metallb-excludel2\") pod \"speaker-9xmmz\" (UID: \"f4c50fcb-c298-476e-b6c6-4af490b8a6ed\") " pod="metallb-system/speaker-9xmmz" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.658171 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f567c71f-c1ec-47c6-9173-8c1a29524cf8-metrics-certs\") pod \"frr-k8s-m42lk\" (UID: \"f567c71f-c1ec-47c6-9173-8c1a29524cf8\") " pod="metallb-system/frr-k8s-m42lk" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.658192 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x65f\" (UniqueName: \"kubernetes.io/projected/f567c71f-c1ec-47c6-9173-8c1a29524cf8-kube-api-access-4x65f\") pod \"frr-k8s-m42lk\" (UID: \"f567c71f-c1ec-47c6-9173-8c1a29524cf8\") " pod="metallb-system/frr-k8s-m42lk" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.658210 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrchj\" (UniqueName: \"kubernetes.io/projected/947998b3-f1a3-486a-91b4-108fcc09af6d-kube-api-access-wrchj\") pod \"frr-k8s-webhook-server-7fcb986d4-fr4tv\" (UID: \"947998b3-f1a3-486a-91b4-108fcc09af6d\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fr4tv" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.759562 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f567c71f-c1ec-47c6-9173-8c1a29524cf8-frr-conf\") pod \"frr-k8s-m42lk\" (UID: \"f567c71f-c1ec-47c6-9173-8c1a29524cf8\") " pod="metallb-system/frr-k8s-m42lk" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.759643 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: 
\"kubernetes.io/empty-dir/f567c71f-c1ec-47c6-9173-8c1a29524cf8-reloader\") pod \"frr-k8s-m42lk\" (UID: \"f567c71f-c1ec-47c6-9173-8c1a29524cf8\") " pod="metallb-system/frr-k8s-m42lk" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.759668 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c50fcb-c298-476e-b6c6-4af490b8a6ed-metrics-certs\") pod \"speaker-9xmmz\" (UID: \"f4c50fcb-c298-476e-b6c6-4af490b8a6ed\") " pod="metallb-system/speaker-9xmmz" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.759707 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmgg7\" (UniqueName: \"kubernetes.io/projected/f4c50fcb-c298-476e-b6c6-4af490b8a6ed-kube-api-access-wmgg7\") pod \"speaker-9xmmz\" (UID: \"f4c50fcb-c298-476e-b6c6-4af490b8a6ed\") " pod="metallb-system/speaker-9xmmz" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.759738 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/947998b3-f1a3-486a-91b4-108fcc09af6d-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-fr4tv\" (UID: \"947998b3-f1a3-486a-91b4-108fcc09af6d\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fr4tv" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.759759 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f567c71f-c1ec-47c6-9173-8c1a29524cf8-frr-startup\") pod \"frr-k8s-m42lk\" (UID: \"f567c71f-c1ec-47c6-9173-8c1a29524cf8\") " pod="metallb-system/frr-k8s-m42lk" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.759792 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g58zq\" (UniqueName: \"kubernetes.io/projected/663169ff-9a0b-4c7c-8326-db8528f88c00-kube-api-access-g58zq\") pod \"controller-f8648f98b-sflcc\" (UID: 
\"663169ff-9a0b-4c7c-8326-db8528f88c00\") " pod="metallb-system/controller-f8648f98b-sflcc" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.759816 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f4c50fcb-c298-476e-b6c6-4af490b8a6ed-metallb-excludel2\") pod \"speaker-9xmmz\" (UID: \"f4c50fcb-c298-476e-b6c6-4af490b8a6ed\") " pod="metallb-system/speaker-9xmmz" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.759851 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f567c71f-c1ec-47c6-9173-8c1a29524cf8-metrics-certs\") pod \"frr-k8s-m42lk\" (UID: \"f567c71f-c1ec-47c6-9173-8c1a29524cf8\") " pod="metallb-system/frr-k8s-m42lk" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.759875 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x65f\" (UniqueName: \"kubernetes.io/projected/f567c71f-c1ec-47c6-9173-8c1a29524cf8-kube-api-access-4x65f\") pod \"frr-k8s-m42lk\" (UID: \"f567c71f-c1ec-47c6-9173-8c1a29524cf8\") " pod="metallb-system/frr-k8s-m42lk" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.759928 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrchj\" (UniqueName: \"kubernetes.io/projected/947998b3-f1a3-486a-91b4-108fcc09af6d-kube-api-access-wrchj\") pod \"frr-k8s-webhook-server-7fcb986d4-fr4tv\" (UID: \"947998b3-f1a3-486a-91b4-108fcc09af6d\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fr4tv" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.759957 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/663169ff-9a0b-4c7c-8326-db8528f88c00-cert\") pod \"controller-f8648f98b-sflcc\" (UID: \"663169ff-9a0b-4c7c-8326-db8528f88c00\") " 
pod="metallb-system/controller-f8648f98b-sflcc" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.759986 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/663169ff-9a0b-4c7c-8326-db8528f88c00-metrics-certs\") pod \"controller-f8648f98b-sflcc\" (UID: \"663169ff-9a0b-4c7c-8326-db8528f88c00\") " pod="metallb-system/controller-f8648f98b-sflcc" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.760015 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f567c71f-c1ec-47c6-9173-8c1a29524cf8-metrics\") pod \"frr-k8s-m42lk\" (UID: \"f567c71f-c1ec-47c6-9173-8c1a29524cf8\") " pod="metallb-system/frr-k8s-m42lk" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.760037 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f4c50fcb-c298-476e-b6c6-4af490b8a6ed-memberlist\") pod \"speaker-9xmmz\" (UID: \"f4c50fcb-c298-476e-b6c6-4af490b8a6ed\") " pod="metallb-system/speaker-9xmmz" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.760065 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f567c71f-c1ec-47c6-9173-8c1a29524cf8-frr-sockets\") pod \"frr-k8s-m42lk\" (UID: \"f567c71f-c1ec-47c6-9173-8c1a29524cf8\") " pod="metallb-system/frr-k8s-m42lk" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.760633 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f567c71f-c1ec-47c6-9173-8c1a29524cf8-frr-sockets\") pod \"frr-k8s-m42lk\" (UID: \"f567c71f-c1ec-47c6-9173-8c1a29524cf8\") " pod="metallb-system/frr-k8s-m42lk" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.760878 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f567c71f-c1ec-47c6-9173-8c1a29524cf8-frr-conf\") pod \"frr-k8s-m42lk\" (UID: \"f567c71f-c1ec-47c6-9173-8c1a29524cf8\") " pod="metallb-system/frr-k8s-m42lk" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.761843 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f567c71f-c1ec-47c6-9173-8c1a29524cf8-metrics\") pod \"frr-k8s-m42lk\" (UID: \"f567c71f-c1ec-47c6-9173-8c1a29524cf8\") " pod="metallb-system/frr-k8s-m42lk" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.761858 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f4c50fcb-c298-476e-b6c6-4af490b8a6ed-metallb-excludel2\") pod \"speaker-9xmmz\" (UID: \"f4c50fcb-c298-476e-b6c6-4af490b8a6ed\") " pod="metallb-system/speaker-9xmmz" Dec 03 12:39:28 crc kubenswrapper[4711]: E1203 12:39:28.762072 4711 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Dec 03 12:39:28 crc kubenswrapper[4711]: E1203 12:39:28.762276 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4c50fcb-c298-476e-b6c6-4af490b8a6ed-metrics-certs podName:f4c50fcb-c298-476e-b6c6-4af490b8a6ed nodeName:}" failed. No retries permitted until 2025-12-03 12:39:29.262258961 +0000 UTC m=+1487.931510316 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4c50fcb-c298-476e-b6c6-4af490b8a6ed-metrics-certs") pod "speaker-9xmmz" (UID: "f4c50fcb-c298-476e-b6c6-4af490b8a6ed") : secret "speaker-certs-secret" not found Dec 03 12:39:28 crc kubenswrapper[4711]: E1203 12:39:28.762201 4711 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 03 12:39:28 crc kubenswrapper[4711]: E1203 12:39:28.762471 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4c50fcb-c298-476e-b6c6-4af490b8a6ed-memberlist podName:f4c50fcb-c298-476e-b6c6-4af490b8a6ed nodeName:}" failed. No retries permitted until 2025-12-03 12:39:29.262455396 +0000 UTC m=+1487.931706651 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f4c50fcb-c298-476e-b6c6-4af490b8a6ed-memberlist") pod "speaker-9xmmz" (UID: "f4c50fcb-c298-476e-b6c6-4af490b8a6ed") : secret "metallb-memberlist" not found Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.762134 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f567c71f-c1ec-47c6-9173-8c1a29524cf8-reloader\") pod \"frr-k8s-m42lk\" (UID: \"f567c71f-c1ec-47c6-9173-8c1a29524cf8\") " pod="metallb-system/frr-k8s-m42lk" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.762724 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f567c71f-c1ec-47c6-9173-8c1a29524cf8-frr-startup\") pod \"frr-k8s-m42lk\" (UID: \"f567c71f-c1ec-47c6-9173-8c1a29524cf8\") " pod="metallb-system/frr-k8s-m42lk" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.764924 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f567c71f-c1ec-47c6-9173-8c1a29524cf8-metrics-certs\") pod 
\"frr-k8s-m42lk\" (UID: \"f567c71f-c1ec-47c6-9173-8c1a29524cf8\") " pod="metallb-system/frr-k8s-m42lk" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.768649 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/947998b3-f1a3-486a-91b4-108fcc09af6d-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-fr4tv\" (UID: \"947998b3-f1a3-486a-91b4-108fcc09af6d\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fr4tv" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.806812 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmgg7\" (UniqueName: \"kubernetes.io/projected/f4c50fcb-c298-476e-b6c6-4af490b8a6ed-kube-api-access-wmgg7\") pod \"speaker-9xmmz\" (UID: \"f4c50fcb-c298-476e-b6c6-4af490b8a6ed\") " pod="metallb-system/speaker-9xmmz" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.810528 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrchj\" (UniqueName: \"kubernetes.io/projected/947998b3-f1a3-486a-91b4-108fcc09af6d-kube-api-access-wrchj\") pod \"frr-k8s-webhook-server-7fcb986d4-fr4tv\" (UID: \"947998b3-f1a3-486a-91b4-108fcc09af6d\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fr4tv" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.818531 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x65f\" (UniqueName: \"kubernetes.io/projected/f567c71f-c1ec-47c6-9173-8c1a29524cf8-kube-api-access-4x65f\") pod \"frr-k8s-m42lk\" (UID: \"f567c71f-c1ec-47c6-9173-8c1a29524cf8\") " pod="metallb-system/frr-k8s-m42lk" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.861022 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g58zq\" (UniqueName: \"kubernetes.io/projected/663169ff-9a0b-4c7c-8326-db8528f88c00-kube-api-access-g58zq\") pod \"controller-f8648f98b-sflcc\" (UID: \"663169ff-9a0b-4c7c-8326-db8528f88c00\") 
" pod="metallb-system/controller-f8648f98b-sflcc" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.861095 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/663169ff-9a0b-4c7c-8326-db8528f88c00-cert\") pod \"controller-f8648f98b-sflcc\" (UID: \"663169ff-9a0b-4c7c-8326-db8528f88c00\") " pod="metallb-system/controller-f8648f98b-sflcc" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.861116 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/663169ff-9a0b-4c7c-8326-db8528f88c00-metrics-certs\") pod \"controller-f8648f98b-sflcc\" (UID: \"663169ff-9a0b-4c7c-8326-db8528f88c00\") " pod="metallb-system/controller-f8648f98b-sflcc" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.864108 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/663169ff-9a0b-4c7c-8326-db8528f88c00-metrics-certs\") pod \"controller-f8648f98b-sflcc\" (UID: \"663169ff-9a0b-4c7c-8326-db8528f88c00\") " pod="metallb-system/controller-f8648f98b-sflcc" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.864460 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/663169ff-9a0b-4c7c-8326-db8528f88c00-cert\") pod \"controller-f8648f98b-sflcc\" (UID: \"663169ff-9a0b-4c7c-8326-db8528f88c00\") " pod="metallb-system/controller-f8648f98b-sflcc" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.878252 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g58zq\" (UniqueName: \"kubernetes.io/projected/663169ff-9a0b-4c7c-8326-db8528f88c00-kube-api-access-g58zq\") pod \"controller-f8648f98b-sflcc\" (UID: \"663169ff-9a0b-4c7c-8326-db8528f88c00\") " pod="metallb-system/controller-f8648f98b-sflcc" Dec 03 12:39:28 crc kubenswrapper[4711]: I1203 12:39:28.891237 
4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-sflcc" Dec 03 12:39:29 crc kubenswrapper[4711]: I1203 12:39:29.078861 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-m42lk" Dec 03 12:39:29 crc kubenswrapper[4711]: I1203 12:39:29.086738 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fr4tv" Dec 03 12:39:29 crc kubenswrapper[4711]: I1203 12:39:29.239377 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m42lk" event={"ID":"f567c71f-c1ec-47c6-9173-8c1a29524cf8","Type":"ContainerStarted","Data":"4ebf92f9d7768ff8959850bd2737a223c4ff9c090c95c4041fb049db88dddb59"} Dec 03 12:39:29 crc kubenswrapper[4711]: I1203 12:39:29.265138 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f4c50fcb-c298-476e-b6c6-4af490b8a6ed-memberlist\") pod \"speaker-9xmmz\" (UID: \"f4c50fcb-c298-476e-b6c6-4af490b8a6ed\") " pod="metallb-system/speaker-9xmmz" Dec 03 12:39:29 crc kubenswrapper[4711]: I1203 12:39:29.265232 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c50fcb-c298-476e-b6c6-4af490b8a6ed-metrics-certs\") pod \"speaker-9xmmz\" (UID: \"f4c50fcb-c298-476e-b6c6-4af490b8a6ed\") " pod="metallb-system/speaker-9xmmz" Dec 03 12:39:29 crc kubenswrapper[4711]: E1203 12:39:29.266524 4711 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 03 12:39:29 crc kubenswrapper[4711]: E1203 12:39:29.266605 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4c50fcb-c298-476e-b6c6-4af490b8a6ed-memberlist podName:f4c50fcb-c298-476e-b6c6-4af490b8a6ed nodeName:}" failed. 
No retries permitted until 2025-12-03 12:39:30.266581856 +0000 UTC m=+1488.935833171 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f4c50fcb-c298-476e-b6c6-4af490b8a6ed-memberlist") pod "speaker-9xmmz" (UID: "f4c50fcb-c298-476e-b6c6-4af490b8a6ed") : secret "metallb-memberlist" not found Dec 03 12:39:29 crc kubenswrapper[4711]: I1203 12:39:29.273030 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c50fcb-c298-476e-b6c6-4af490b8a6ed-metrics-certs\") pod \"speaker-9xmmz\" (UID: \"f4c50fcb-c298-476e-b6c6-4af490b8a6ed\") " pod="metallb-system/speaker-9xmmz" Dec 03 12:39:29 crc kubenswrapper[4711]: I1203 12:39:29.279846 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-sflcc"] Dec 03 12:39:29 crc kubenswrapper[4711]: I1203 12:39:29.280060 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-fr4tv"] Dec 03 12:39:29 crc kubenswrapper[4711]: W1203 12:39:29.293069 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod947998b3_f1a3_486a_91b4_108fcc09af6d.slice/crio-cb74d95ac9644b4d2a000489aee7e9abcbda2d723907905e74bd20765cfade2d WatchSource:0}: Error finding container cb74d95ac9644b4d2a000489aee7e9abcbda2d723907905e74bd20765cfade2d: Status 404 returned error can't find the container with id cb74d95ac9644b4d2a000489aee7e9abcbda2d723907905e74bd20765cfade2d Dec 03 12:39:30 crc kubenswrapper[4711]: I1203 12:39:30.251484 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fr4tv" event={"ID":"947998b3-f1a3-486a-91b4-108fcc09af6d","Type":"ContainerStarted","Data":"cb74d95ac9644b4d2a000489aee7e9abcbda2d723907905e74bd20765cfade2d"} Dec 03 12:39:30 crc kubenswrapper[4711]: I1203 12:39:30.253554 4711 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-sflcc" event={"ID":"663169ff-9a0b-4c7c-8326-db8528f88c00","Type":"ContainerStarted","Data":"d153d87ca1e6de7fb2c6314d7fa034cfbb3abc83ab191a29641966a9d8eb9fba"} Dec 03 12:39:30 crc kubenswrapper[4711]: I1203 12:39:30.253579 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-sflcc" event={"ID":"663169ff-9a0b-4c7c-8326-db8528f88c00","Type":"ContainerStarted","Data":"65f3235ab64b6643fe24792f760985b9224690f7b36c0e1eb7be4932fa10e8cb"} Dec 03 12:39:30 crc kubenswrapper[4711]: I1203 12:39:30.278954 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f4c50fcb-c298-476e-b6c6-4af490b8a6ed-memberlist\") pod \"speaker-9xmmz\" (UID: \"f4c50fcb-c298-476e-b6c6-4af490b8a6ed\") " pod="metallb-system/speaker-9xmmz" Dec 03 12:39:30 crc kubenswrapper[4711]: I1203 12:39:30.288823 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f4c50fcb-c298-476e-b6c6-4af490b8a6ed-memberlist\") pod \"speaker-9xmmz\" (UID: \"f4c50fcb-c298-476e-b6c6-4af490b8a6ed\") " pod="metallb-system/speaker-9xmmz" Dec 03 12:39:30 crc kubenswrapper[4711]: I1203 12:39:30.366010 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-9xmmz" Dec 03 12:39:30 crc kubenswrapper[4711]: W1203 12:39:30.420214 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4c50fcb_c298_476e_b6c6_4af490b8a6ed.slice/crio-c6db13924b4a9e0624c8ce47a7d58d39680a4ba3ddffa25a8388bc61ba3cecca WatchSource:0}: Error finding container c6db13924b4a9e0624c8ce47a7d58d39680a4ba3ddffa25a8388bc61ba3cecca: Status 404 returned error can't find the container with id c6db13924b4a9e0624c8ce47a7d58d39680a4ba3ddffa25a8388bc61ba3cecca Dec 03 12:39:31 crc kubenswrapper[4711]: I1203 12:39:31.260767 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-9xmmz" event={"ID":"f4c50fcb-c298-476e-b6c6-4af490b8a6ed","Type":"ContainerStarted","Data":"a1a6b7aeea3d1881add2eb935f5c5fc0bfff160faf33533c7f43db38c9152eea"} Dec 03 12:39:31 crc kubenswrapper[4711]: I1203 12:39:31.260809 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-9xmmz" event={"ID":"f4c50fcb-c298-476e-b6c6-4af490b8a6ed","Type":"ContainerStarted","Data":"c6db13924b4a9e0624c8ce47a7d58d39680a4ba3ddffa25a8388bc61ba3cecca"} Dec 03 12:39:33 crc kubenswrapper[4711]: I1203 12:39:33.279420 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-sflcc" event={"ID":"663169ff-9a0b-4c7c-8326-db8528f88c00","Type":"ContainerStarted","Data":"c0483ebb02604c5a013f054ab13e47b0452e94c177a30814f31470d7c94a0402"} Dec 03 12:39:33 crc kubenswrapper[4711]: I1203 12:39:33.280763 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-sflcc" Dec 03 12:39:33 crc kubenswrapper[4711]: I1203 12:39:33.290360 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-9xmmz" 
event={"ID":"f4c50fcb-c298-476e-b6c6-4af490b8a6ed","Type":"ContainerStarted","Data":"adaed58160e1d83cc07e7af9cdc7d30c9a3fc5be5187bbe49fedf187335a207e"}
Dec 03 12:39:33 crc kubenswrapper[4711]: I1203 12:39:33.292309 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-9xmmz"
Dec 03 12:39:33 crc kubenswrapper[4711]: I1203 12:39:33.303481 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-sflcc" podStartSLOduration=1.882954742 podStartE2EDuration="5.303458958s" podCreationTimestamp="2025-12-03 12:39:28 +0000 UTC" firstStartedPulling="2025-12-03 12:39:29.413168161 +0000 UTC m=+1488.082419426" lastFinishedPulling="2025-12-03 12:39:32.833672377 +0000 UTC m=+1491.502923642" observedRunningTime="2025-12-03 12:39:33.301407282 +0000 UTC m=+1491.970658567" watchObservedRunningTime="2025-12-03 12:39:33.303458958 +0000 UTC m=+1491.972710213"
Dec 03 12:39:33 crc kubenswrapper[4711]: I1203 12:39:33.326570 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-9xmmz" podStartSLOduration=3.264028817 podStartE2EDuration="5.326542296s" podCreationTimestamp="2025-12-03 12:39:28 +0000 UTC" firstStartedPulling="2025-12-03 12:39:30.78498445 +0000 UTC m=+1489.454235695" lastFinishedPulling="2025-12-03 12:39:32.847497919 +0000 UTC m=+1491.516749174" observedRunningTime="2025-12-03 12:39:33.319862501 +0000 UTC m=+1491.989113776" watchObservedRunningTime="2025-12-03 12:39:33.326542296 +0000 UTC m=+1491.995793551"
Dec 03 12:39:35 crc kubenswrapper[4711]: I1203 12:39:35.402018 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 12:39:35 crc kubenswrapper[4711]: I1203 12:39:35.402105 4711 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 12:39:37 crc kubenswrapper[4711]: I1203 12:39:37.328217 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fr4tv" event={"ID":"947998b3-f1a3-486a-91b4-108fcc09af6d","Type":"ContainerStarted","Data":"dd776fb48b6f5f1b9609b4f0e03f75afdb4b50d5e4b034cf9e7f40cba82ecdab"}
Dec 03 12:39:37 crc kubenswrapper[4711]: I1203 12:39:37.328799 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fr4tv"
Dec 03 12:39:37 crc kubenswrapper[4711]: I1203 12:39:37.330113 4711 generic.go:334] "Generic (PLEG): container finished" podID="f567c71f-c1ec-47c6-9173-8c1a29524cf8" containerID="391b17386f40f2e5a52642370cd88b6a32b0b609697269a6d5ae1796787ae5e9" exitCode=0
Dec 03 12:39:37 crc kubenswrapper[4711]: I1203 12:39:37.330149 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m42lk" event={"ID":"f567c71f-c1ec-47c6-9173-8c1a29524cf8","Type":"ContainerDied","Data":"391b17386f40f2e5a52642370cd88b6a32b0b609697269a6d5ae1796787ae5e9"}
Dec 03 12:39:37 crc kubenswrapper[4711]: I1203 12:39:37.349261 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fr4tv" podStartSLOduration=2.177943702 podStartE2EDuration="9.349236197s" podCreationTimestamp="2025-12-03 12:39:28 +0000 UTC" firstStartedPulling="2025-12-03 12:39:29.295840744 +0000 UTC m=+1487.965091999" lastFinishedPulling="2025-12-03 12:39:36.467133239 +0000 UTC m=+1495.136384494" observedRunningTime="2025-12-03 12:39:37.345289268 +0000 UTC m=+1496.014540563" watchObservedRunningTime="2025-12-03 12:39:37.349236197 +0000 UTC m=+1496.018487462"
Dec 03 12:39:38 crc kubenswrapper[4711]: I1203 12:39:38.338220 4711 generic.go:334] "Generic (PLEG): container finished" podID="f567c71f-c1ec-47c6-9173-8c1a29524cf8" containerID="cceefb79bcee43ff9fa218fa2c13259347005c65bac6c5b65c3538f920dfad16" exitCode=0
Dec 03 12:39:38 crc kubenswrapper[4711]: I1203 12:39:38.338293 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m42lk" event={"ID":"f567c71f-c1ec-47c6-9173-8c1a29524cf8","Type":"ContainerDied","Data":"cceefb79bcee43ff9fa218fa2c13259347005c65bac6c5b65c3538f920dfad16"}
Dec 03 12:39:39 crc kubenswrapper[4711]: I1203 12:39:39.346417 4711 generic.go:334] "Generic (PLEG): container finished" podID="f567c71f-c1ec-47c6-9173-8c1a29524cf8" containerID="5c7a39f9a2754de4a9fd7f93469c0bc48342beea44f63fd8a0e7d8d3856d7304" exitCode=0
Dec 03 12:39:39 crc kubenswrapper[4711]: I1203 12:39:39.346463 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m42lk" event={"ID":"f567c71f-c1ec-47c6-9173-8c1a29524cf8","Type":"ContainerDied","Data":"5c7a39f9a2754de4a9fd7f93469c0bc48342beea44f63fd8a0e7d8d3856d7304"}
Dec 03 12:39:40 crc kubenswrapper[4711]: I1203 12:39:40.356219 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m42lk" event={"ID":"f567c71f-c1ec-47c6-9173-8c1a29524cf8","Type":"ContainerStarted","Data":"e8af8ec4a5e0307aef2d76f28eeebc28d1e1908d329dc23ec304fa6a261d11be"}
Dec 03 12:39:40 crc kubenswrapper[4711]: I1203 12:39:40.356795 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-m42lk"
Dec 03 12:39:40 crc kubenswrapper[4711]: I1203 12:39:40.356811 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m42lk" event={"ID":"f567c71f-c1ec-47c6-9173-8c1a29524cf8","Type":"ContainerStarted","Data":"fcca6dda782837fe5c788b41e31cf118361298d7021d2a87c588e898e1afdbe4"}
Dec 03 12:39:40 crc kubenswrapper[4711]: I1203 12:39:40.356824 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m42lk" event={"ID":"f567c71f-c1ec-47c6-9173-8c1a29524cf8","Type":"ContainerStarted","Data":"4f4991d5b926236fd3d7a11b674b8850572a9eae3c11708fc160082e4bdcc943"}
Dec 03 12:39:40 crc kubenswrapper[4711]: I1203 12:39:40.356836 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m42lk" event={"ID":"f567c71f-c1ec-47c6-9173-8c1a29524cf8","Type":"ContainerStarted","Data":"05f762a6591c7a252c2157d26f87114ad3ea2b75593d33e172f7d01d618cee7d"}
Dec 03 12:39:40 crc kubenswrapper[4711]: I1203 12:39:40.356847 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m42lk" event={"ID":"f567c71f-c1ec-47c6-9173-8c1a29524cf8","Type":"ContainerStarted","Data":"9fece0b19e20d88581bad3f56aa3a2a59e90906e6be5888a9e9a629acc04b9a9"}
Dec 03 12:39:40 crc kubenswrapper[4711]: I1203 12:39:40.356856 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m42lk" event={"ID":"f567c71f-c1ec-47c6-9173-8c1a29524cf8","Type":"ContainerStarted","Data":"b61a28006420b4425c0458d97454cf3c2f9b0dc6d2841aa06affd70f1faba4d2"}
Dec 03 12:39:40 crc kubenswrapper[4711]: I1203 12:39:40.369265 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-9xmmz"
Dec 03 12:39:40 crc kubenswrapper[4711]: I1203 12:39:40.421724 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-m42lk" podStartSLOduration=5.147594079 podStartE2EDuration="12.42170627s" podCreationTimestamp="2025-12-03 12:39:28 +0000 UTC" firstStartedPulling="2025-12-03 12:39:29.211875657 +0000 UTC m=+1487.881126912" lastFinishedPulling="2025-12-03 12:39:36.485987838 +0000 UTC m=+1495.155239103" observedRunningTime="2025-12-03 12:39:40.392589327 +0000 UTC m=+1499.061840592" watchObservedRunningTime="2025-12-03 12:39:40.42170627 +0000 UTC m=+1499.090957525"
Dec 03 12:39:44 crc kubenswrapper[4711]: I1203 12:39:44.079272 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-m42lk"
Dec 03 12:39:44 crc kubenswrapper[4711]: I1203 12:39:44.118000 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-m42lk"
Dec 03 12:39:46 crc kubenswrapper[4711]: I1203 12:39:46.569250 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-sgfrm"]
Dec 03 12:39:46 crc kubenswrapper[4711]: I1203 12:39:46.570304 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-sgfrm"
Dec 03 12:39:46 crc kubenswrapper[4711]: I1203 12:39:46.581992 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Dec 03 12:39:46 crc kubenswrapper[4711]: I1203 12:39:46.582089 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-26s22"
Dec 03 12:39:46 crc kubenswrapper[4711]: I1203 12:39:46.582007 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Dec 03 12:39:46 crc kubenswrapper[4711]: I1203 12:39:46.598291 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-sgfrm"]
Dec 03 12:39:46 crc kubenswrapper[4711]: I1203 12:39:46.705524 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxzvg\" (UniqueName: \"kubernetes.io/projected/af299464-3fa1-4ecd-9f88-3dd118b756cb-kube-api-access-wxzvg\") pod \"mariadb-operator-index-sgfrm\" (UID: \"af299464-3fa1-4ecd-9f88-3dd118b756cb\") " pod="openstack-operators/mariadb-operator-index-sgfrm"
Dec 03 12:39:46 crc kubenswrapper[4711]: I1203 12:39:46.806547 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxzvg\" (UniqueName: \"kubernetes.io/projected/af299464-3fa1-4ecd-9f88-3dd118b756cb-kube-api-access-wxzvg\") pod \"mariadb-operator-index-sgfrm\" (UID: \"af299464-3fa1-4ecd-9f88-3dd118b756cb\") " pod="openstack-operators/mariadb-operator-index-sgfrm"
Dec 03 12:39:46 crc kubenswrapper[4711]: I1203 12:39:46.825049 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxzvg\" (UniqueName: \"kubernetes.io/projected/af299464-3fa1-4ecd-9f88-3dd118b756cb-kube-api-access-wxzvg\") pod \"mariadb-operator-index-sgfrm\" (UID: \"af299464-3fa1-4ecd-9f88-3dd118b756cb\") " pod="openstack-operators/mariadb-operator-index-sgfrm"
Dec 03 12:39:46 crc kubenswrapper[4711]: I1203 12:39:46.903098 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-sgfrm"
Dec 03 12:39:47 crc kubenswrapper[4711]: I1203 12:39:47.326771 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-sgfrm"]
Dec 03 12:39:47 crc kubenswrapper[4711]: W1203 12:39:47.331386 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf299464_3fa1_4ecd_9f88_3dd118b756cb.slice/crio-6af259f9029fd1cbe9d4c55f9f2251d7baa1c2aabf041d3a5aef958963f2312f WatchSource:0}: Error finding container 6af259f9029fd1cbe9d4c55f9f2251d7baa1c2aabf041d3a5aef958963f2312f: Status 404 returned error can't find the container with id 6af259f9029fd1cbe9d4c55f9f2251d7baa1c2aabf041d3a5aef958963f2312f
Dec 03 12:39:47 crc kubenswrapper[4711]: I1203 12:39:47.398266 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-sgfrm" event={"ID":"af299464-3fa1-4ecd-9f88-3dd118b756cb","Type":"ContainerStarted","Data":"6af259f9029fd1cbe9d4c55f9f2251d7baa1c2aabf041d3a5aef958963f2312f"}
Dec 03 12:39:48 crc kubenswrapper[4711]: I1203 12:39:48.895668 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-sflcc"
Dec 03 12:39:49 crc kubenswrapper[4711]: I1203 12:39:49.082163 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-m42lk"
Dec 03 12:39:49 crc kubenswrapper[4711]: I1203 12:39:49.090480 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fr4tv"
Dec 03 12:39:49 crc kubenswrapper[4711]: I1203 12:39:49.421921 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-sgfrm" event={"ID":"af299464-3fa1-4ecd-9f88-3dd118b756cb","Type":"ContainerStarted","Data":"0e4d6fa39588a16a174683c4a8d7be6f4e72ad43b4d318922da75b437b4656ca"}
Dec 03 12:39:49 crc kubenswrapper[4711]: I1203 12:39:49.435272 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-sgfrm" podStartSLOduration=2.375548476 podStartE2EDuration="3.435247396s" podCreationTimestamp="2025-12-03 12:39:46 +0000 UTC" firstStartedPulling="2025-12-03 12:39:47.338263277 +0000 UTC m=+1506.007514552" lastFinishedPulling="2025-12-03 12:39:48.397962217 +0000 UTC m=+1507.067213472" observedRunningTime="2025-12-03 12:39:49.43397761 +0000 UTC m=+1508.103228905" watchObservedRunningTime="2025-12-03 12:39:49.435247396 +0000 UTC m=+1508.104498661"
Dec 03 12:39:49 crc kubenswrapper[4711]: I1203 12:39:49.947635 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-sgfrm"]
Dec 03 12:39:50 crc kubenswrapper[4711]: I1203 12:39:50.553329 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-5qvn8"]
Dec 03 12:39:50 crc kubenswrapper[4711]: I1203 12:39:50.554208 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-5qvn8"
Dec 03 12:39:50 crc kubenswrapper[4711]: I1203 12:39:50.565426 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-5qvn8"]
Dec 03 12:39:50 crc kubenswrapper[4711]: I1203 12:39:50.656866 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd9j4\" (UniqueName: \"kubernetes.io/projected/a5448fb5-04ac-4514-b7f3-39e3c17a10cb-kube-api-access-xd9j4\") pod \"mariadb-operator-index-5qvn8\" (UID: \"a5448fb5-04ac-4514-b7f3-39e3c17a10cb\") " pod="openstack-operators/mariadb-operator-index-5qvn8"
Dec 03 12:39:50 crc kubenswrapper[4711]: I1203 12:39:50.758524 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd9j4\" (UniqueName: \"kubernetes.io/projected/a5448fb5-04ac-4514-b7f3-39e3c17a10cb-kube-api-access-xd9j4\") pod \"mariadb-operator-index-5qvn8\" (UID: \"a5448fb5-04ac-4514-b7f3-39e3c17a10cb\") " pod="openstack-operators/mariadb-operator-index-5qvn8"
Dec 03 12:39:50 crc kubenswrapper[4711]: I1203 12:39:50.781322 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd9j4\" (UniqueName: \"kubernetes.io/projected/a5448fb5-04ac-4514-b7f3-39e3c17a10cb-kube-api-access-xd9j4\") pod \"mariadb-operator-index-5qvn8\" (UID: \"a5448fb5-04ac-4514-b7f3-39e3c17a10cb\") " pod="openstack-operators/mariadb-operator-index-5qvn8"
Dec 03 12:39:50 crc kubenswrapper[4711]: I1203 12:39:50.875187 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-5qvn8"
Dec 03 12:39:51 crc kubenswrapper[4711]: I1203 12:39:51.065012 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-5qvn8"]
Dec 03 12:39:51 crc kubenswrapper[4711]: I1203 12:39:51.433640 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-sgfrm" podUID="af299464-3fa1-4ecd-9f88-3dd118b756cb" containerName="registry-server" containerID="cri-o://0e4d6fa39588a16a174683c4a8d7be6f4e72ad43b4d318922da75b437b4656ca" gracePeriod=2
Dec 03 12:39:51 crc kubenswrapper[4711]: I1203 12:39:51.434185 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-5qvn8" event={"ID":"a5448fb5-04ac-4514-b7f3-39e3c17a10cb","Type":"ContainerStarted","Data":"865dfdba6ef30597ebf8fcdb0ef9c6e302947d8079c0f20e407b18c703441350"}
Dec 03 12:39:52 crc kubenswrapper[4711]: I1203 12:39:52.438487 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-5qvn8" event={"ID":"a5448fb5-04ac-4514-b7f3-39e3c17a10cb","Type":"ContainerStarted","Data":"5745afd087a2675655c63fcaaaf69fbcb39f61ee594935c73e7c0061e2c3767c"}
Dec 03 12:39:52 crc kubenswrapper[4711]: I1203 12:39:52.440030 4711 generic.go:334] "Generic (PLEG): container finished" podID="af299464-3fa1-4ecd-9f88-3dd118b756cb" containerID="0e4d6fa39588a16a174683c4a8d7be6f4e72ad43b4d318922da75b437b4656ca" exitCode=0
Dec 03 12:39:52 crc kubenswrapper[4711]: I1203 12:39:52.440065 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-sgfrm" event={"ID":"af299464-3fa1-4ecd-9f88-3dd118b756cb","Type":"ContainerDied","Data":"0e4d6fa39588a16a174683c4a8d7be6f4e72ad43b4d318922da75b437b4656ca"}
Dec 03 12:39:52 crc kubenswrapper[4711]: I1203 12:39:52.451474 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-5qvn8" podStartSLOduration=1.297903039 podStartE2EDuration="2.451457227s" podCreationTimestamp="2025-12-03 12:39:50 +0000 UTC" firstStartedPulling="2025-12-03 12:39:51.074583427 +0000 UTC m=+1509.743834682" lastFinishedPulling="2025-12-03 12:39:52.228137615 +0000 UTC m=+1510.897388870" observedRunningTime="2025-12-03 12:39:52.450109719 +0000 UTC m=+1511.119360994" watchObservedRunningTime="2025-12-03 12:39:52.451457227 +0000 UTC m=+1511.120708482"
Dec 03 12:39:52 crc kubenswrapper[4711]: I1203 12:39:52.928653 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-sgfrm"
Dec 03 12:39:53 crc kubenswrapper[4711]: I1203 12:39:53.093492 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxzvg\" (UniqueName: \"kubernetes.io/projected/af299464-3fa1-4ecd-9f88-3dd118b756cb-kube-api-access-wxzvg\") pod \"af299464-3fa1-4ecd-9f88-3dd118b756cb\" (UID: \"af299464-3fa1-4ecd-9f88-3dd118b756cb\") "
Dec 03 12:39:53 crc kubenswrapper[4711]: I1203 12:39:53.106241 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af299464-3fa1-4ecd-9f88-3dd118b756cb-kube-api-access-wxzvg" (OuterVolumeSpecName: "kube-api-access-wxzvg") pod "af299464-3fa1-4ecd-9f88-3dd118b756cb" (UID: "af299464-3fa1-4ecd-9f88-3dd118b756cb"). InnerVolumeSpecName "kube-api-access-wxzvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:39:53 crc kubenswrapper[4711]: I1203 12:39:53.196508 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxzvg\" (UniqueName: \"kubernetes.io/projected/af299464-3fa1-4ecd-9f88-3dd118b756cb-kube-api-access-wxzvg\") on node \"crc\" DevicePath \"\""
Dec 03 12:39:53 crc kubenswrapper[4711]: I1203 12:39:53.449279 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-sgfrm" event={"ID":"af299464-3fa1-4ecd-9f88-3dd118b756cb","Type":"ContainerDied","Data":"6af259f9029fd1cbe9d4c55f9f2251d7baa1c2aabf041d3a5aef958963f2312f"}
Dec 03 12:39:53 crc kubenswrapper[4711]: I1203 12:39:53.449335 4711 scope.go:117] "RemoveContainer" containerID="0e4d6fa39588a16a174683c4a8d7be6f4e72ad43b4d318922da75b437b4656ca"
Dec 03 12:39:53 crc kubenswrapper[4711]: I1203 12:39:53.449334 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-sgfrm"
Dec 03 12:39:53 crc kubenswrapper[4711]: I1203 12:39:53.496103 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-sgfrm"]
Dec 03 12:39:53 crc kubenswrapper[4711]: I1203 12:39:53.504216 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-sgfrm"]
Dec 03 12:39:53 crc kubenswrapper[4711]: I1203 12:39:53.825554 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af299464-3fa1-4ecd-9f88-3dd118b756cb" path="/var/lib/kubelet/pods/af299464-3fa1-4ecd-9f88-3dd118b756cb/volumes"
Dec 03 12:40:00 crc kubenswrapper[4711]: I1203 12:40:00.876236 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-index-5qvn8"
Dec 03 12:40:00 crc kubenswrapper[4711]: I1203 12:40:00.877091 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/mariadb-operator-index-5qvn8"
Dec 03 12:40:00 crc kubenswrapper[4711]: I1203 12:40:00.928355 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/mariadb-operator-index-5qvn8"
Dec 03 12:40:01 crc kubenswrapper[4711]: I1203 12:40:01.540409 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-index-5qvn8"
Dec 03 12:40:05 crc kubenswrapper[4711]: I1203 12:40:05.402160 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 12:40:05 crc kubenswrapper[4711]: I1203 12:40:05.403720 4711 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 12:40:07 crc kubenswrapper[4711]: I1203 12:40:07.947382 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d"]
Dec 03 12:40:07 crc kubenswrapper[4711]: E1203 12:40:07.948170 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af299464-3fa1-4ecd-9f88-3dd118b756cb" containerName="registry-server"
Dec 03 12:40:07 crc kubenswrapper[4711]: I1203 12:40:07.948186 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="af299464-3fa1-4ecd-9f88-3dd118b756cb" containerName="registry-server"
Dec 03 12:40:07 crc kubenswrapper[4711]: I1203 12:40:07.948301 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="af299464-3fa1-4ecd-9f88-3dd118b756cb" containerName="registry-server"
Dec 03 12:40:07 crc kubenswrapper[4711]: I1203 12:40:07.949297 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d"
Dec 03 12:40:07 crc kubenswrapper[4711]: I1203 12:40:07.952880 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-gw8hz"
Dec 03 12:40:07 crc kubenswrapper[4711]: I1203 12:40:07.963562 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d"]
Dec 03 12:40:08 crc kubenswrapper[4711]: I1203 12:40:08.097613 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d27600e8-c4f5-416a-8138-51c87928cbdf-bundle\") pod \"55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d\" (UID: \"d27600e8-c4f5-416a-8138-51c87928cbdf\") " pod="openstack-operators/55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d"
Dec 03 12:40:08 crc kubenswrapper[4711]: I1203 12:40:08.098074 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtbsr\" (UniqueName: \"kubernetes.io/projected/d27600e8-c4f5-416a-8138-51c87928cbdf-kube-api-access-jtbsr\") pod \"55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d\" (UID: \"d27600e8-c4f5-416a-8138-51c87928cbdf\") " pod="openstack-operators/55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d"
Dec 03 12:40:08 crc kubenswrapper[4711]: I1203 12:40:08.098226 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d27600e8-c4f5-416a-8138-51c87928cbdf-util\") pod \"55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d\" (UID: \"d27600e8-c4f5-416a-8138-51c87928cbdf\") " pod="openstack-operators/55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d"
Dec 03 12:40:08 crc kubenswrapper[4711]: I1203 12:40:08.199367 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d27600e8-c4f5-416a-8138-51c87928cbdf-util\") pod \"55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d\" (UID: \"d27600e8-c4f5-416a-8138-51c87928cbdf\") " pod="openstack-operators/55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d"
Dec 03 12:40:08 crc kubenswrapper[4711]: I1203 12:40:08.199447 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d27600e8-c4f5-416a-8138-51c87928cbdf-bundle\") pod \"55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d\" (UID: \"d27600e8-c4f5-416a-8138-51c87928cbdf\") " pod="openstack-operators/55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d"
Dec 03 12:40:08 crc kubenswrapper[4711]: I1203 12:40:08.199529 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtbsr\" (UniqueName: \"kubernetes.io/projected/d27600e8-c4f5-416a-8138-51c87928cbdf-kube-api-access-jtbsr\") pod \"55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d\" (UID: \"d27600e8-c4f5-416a-8138-51c87928cbdf\") " pod="openstack-operators/55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d"
Dec 03 12:40:08 crc kubenswrapper[4711]: I1203 12:40:08.200153 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d27600e8-c4f5-416a-8138-51c87928cbdf-bundle\") pod \"55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d\" (UID: \"d27600e8-c4f5-416a-8138-51c87928cbdf\") " pod="openstack-operators/55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d"
Dec 03 12:40:08 crc kubenswrapper[4711]: I1203 12:40:08.200166 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d27600e8-c4f5-416a-8138-51c87928cbdf-util\") pod \"55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d\" (UID: \"d27600e8-c4f5-416a-8138-51c87928cbdf\") " pod="openstack-operators/55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d"
Dec 03 12:40:08 crc kubenswrapper[4711]: I1203 12:40:08.223514 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtbsr\" (UniqueName: \"kubernetes.io/projected/d27600e8-c4f5-416a-8138-51c87928cbdf-kube-api-access-jtbsr\") pod \"55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d\" (UID: \"d27600e8-c4f5-416a-8138-51c87928cbdf\") " pod="openstack-operators/55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d"
Dec 03 12:40:08 crc kubenswrapper[4711]: I1203 12:40:08.267949 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d"
Dec 03 12:40:08 crc kubenswrapper[4711]: I1203 12:40:08.687380 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d"]
Dec 03 12:40:08 crc kubenswrapper[4711]: W1203 12:40:08.698184 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd27600e8_c4f5_416a_8138_51c87928cbdf.slice/crio-23bb8577756224b9d36b030837c44a413293abd2fef585abbff5eaa8e94af192 WatchSource:0}: Error finding container 23bb8577756224b9d36b030837c44a413293abd2fef585abbff5eaa8e94af192: Status 404 returned error can't find the container with id 23bb8577756224b9d36b030837c44a413293abd2fef585abbff5eaa8e94af192
Dec 03 12:40:09 crc kubenswrapper[4711]: I1203 12:40:09.566495 4711 generic.go:334] "Generic (PLEG): container finished" podID="d27600e8-c4f5-416a-8138-51c87928cbdf" containerID="9aac5b3f338805a103f88271aead064f427b070a0dd69e6403358e261497e3ad" exitCode=0
Dec 03 12:40:09 crc kubenswrapper[4711]: I1203 12:40:09.566546 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d" event={"ID":"d27600e8-c4f5-416a-8138-51c87928cbdf","Type":"ContainerDied","Data":"9aac5b3f338805a103f88271aead064f427b070a0dd69e6403358e261497e3ad"}
Dec 03 12:40:09 crc kubenswrapper[4711]: I1203 12:40:09.566573 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d" event={"ID":"d27600e8-c4f5-416a-8138-51c87928cbdf","Type":"ContainerStarted","Data":"23bb8577756224b9d36b030837c44a413293abd2fef585abbff5eaa8e94af192"}
Dec 03 12:40:10 crc kubenswrapper[4711]: I1203 12:40:10.574455 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d" event={"ID":"d27600e8-c4f5-416a-8138-51c87928cbdf","Type":"ContainerStarted","Data":"54da4ee63989007aa04aff42a2022bdea8157f5ac496bcaedd73f85356fb7c1f"}
Dec 03 12:40:11 crc kubenswrapper[4711]: I1203 12:40:11.584535 4711 generic.go:334] "Generic (PLEG): container finished" podID="d27600e8-c4f5-416a-8138-51c87928cbdf" containerID="54da4ee63989007aa04aff42a2022bdea8157f5ac496bcaedd73f85356fb7c1f" exitCode=0
Dec 03 12:40:11 crc kubenswrapper[4711]: I1203 12:40:11.584612 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d" event={"ID":"d27600e8-c4f5-416a-8138-51c87928cbdf","Type":"ContainerDied","Data":"54da4ee63989007aa04aff42a2022bdea8157f5ac496bcaedd73f85356fb7c1f"}
Dec 03 12:40:12 crc kubenswrapper[4711]: I1203 12:40:12.591273 4711 generic.go:334] "Generic (PLEG): container finished" podID="d27600e8-c4f5-416a-8138-51c87928cbdf" containerID="9755d3b30d839e32fb4801448c6b05bc9b58362585a72853c827400b54056d36" exitCode=0
Dec 03 12:40:12 crc kubenswrapper[4711]: I1203 12:40:12.591365 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d" event={"ID":"d27600e8-c4f5-416a-8138-51c87928cbdf","Type":"ContainerDied","Data":"9755d3b30d839e32fb4801448c6b05bc9b58362585a72853c827400b54056d36"}
Dec 03 12:40:13 crc kubenswrapper[4711]: I1203 12:40:13.802708 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d"
Dec 03 12:40:13 crc kubenswrapper[4711]: I1203 12:40:13.976263 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d27600e8-c4f5-416a-8138-51c87928cbdf-util\") pod \"d27600e8-c4f5-416a-8138-51c87928cbdf\" (UID: \"d27600e8-c4f5-416a-8138-51c87928cbdf\") "
Dec 03 12:40:13 crc kubenswrapper[4711]: I1203 12:40:13.976710 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtbsr\" (UniqueName: \"kubernetes.io/projected/d27600e8-c4f5-416a-8138-51c87928cbdf-kube-api-access-jtbsr\") pod \"d27600e8-c4f5-416a-8138-51c87928cbdf\" (UID: \"d27600e8-c4f5-416a-8138-51c87928cbdf\") "
Dec 03 12:40:13 crc kubenswrapper[4711]: I1203 12:40:13.976755 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d27600e8-c4f5-416a-8138-51c87928cbdf-bundle\") pod \"d27600e8-c4f5-416a-8138-51c87928cbdf\" (UID: \"d27600e8-c4f5-416a-8138-51c87928cbdf\") "
Dec 03 12:40:13 crc kubenswrapper[4711]: I1203 12:40:13.977957 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d27600e8-c4f5-416a-8138-51c87928cbdf-bundle" (OuterVolumeSpecName: "bundle") pod "d27600e8-c4f5-416a-8138-51c87928cbdf" (UID: "d27600e8-c4f5-416a-8138-51c87928cbdf"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:40:13 crc kubenswrapper[4711]: I1203 12:40:13.982918 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d27600e8-c4f5-416a-8138-51c87928cbdf-kube-api-access-jtbsr" (OuterVolumeSpecName: "kube-api-access-jtbsr") pod "d27600e8-c4f5-416a-8138-51c87928cbdf" (UID: "d27600e8-c4f5-416a-8138-51c87928cbdf"). InnerVolumeSpecName "kube-api-access-jtbsr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:40:14 crc kubenswrapper[4711]: I1203 12:40:14.002872 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d27600e8-c4f5-416a-8138-51c87928cbdf-util" (OuterVolumeSpecName: "util") pod "d27600e8-c4f5-416a-8138-51c87928cbdf" (UID: "d27600e8-c4f5-416a-8138-51c87928cbdf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:40:14 crc kubenswrapper[4711]: I1203 12:40:14.078534 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtbsr\" (UniqueName: \"kubernetes.io/projected/d27600e8-c4f5-416a-8138-51c87928cbdf-kube-api-access-jtbsr\") on node \"crc\" DevicePath \"\""
Dec 03 12:40:14 crc kubenswrapper[4711]: I1203 12:40:14.078591 4711 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d27600e8-c4f5-416a-8138-51c87928cbdf-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 12:40:14 crc kubenswrapper[4711]: I1203 12:40:14.078614 4711 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d27600e8-c4f5-416a-8138-51c87928cbdf-util\") on node \"crc\" DevicePath \"\""
Dec 03 12:40:14 crc kubenswrapper[4711]: I1203 12:40:14.603834 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d" event={"ID":"d27600e8-c4f5-416a-8138-51c87928cbdf","Type":"ContainerDied","Data":"23bb8577756224b9d36b030837c44a413293abd2fef585abbff5eaa8e94af192"}
Dec 03 12:40:14 crc kubenswrapper[4711]: I1203 12:40:14.603880 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23bb8577756224b9d36b030837c44a413293abd2fef585abbff5eaa8e94af192"
Dec 03 12:40:14 crc kubenswrapper[4711]: I1203 12:40:14.603889 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d"
Dec 03 12:40:20 crc kubenswrapper[4711]: I1203 12:40:20.843830 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-69f7cc4dcf-wgllq"]
Dec 03 12:40:20 crc kubenswrapper[4711]: E1203 12:40:20.844612 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d27600e8-c4f5-416a-8138-51c87928cbdf" containerName="pull"
Dec 03 12:40:20 crc kubenswrapper[4711]: I1203 12:40:20.844629 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="d27600e8-c4f5-416a-8138-51c87928cbdf" containerName="pull"
Dec 03 12:40:20 crc kubenswrapper[4711]: E1203 12:40:20.844640 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d27600e8-c4f5-416a-8138-51c87928cbdf" containerName="util"
Dec 03 12:40:20 crc kubenswrapper[4711]: I1203 12:40:20.844648 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="d27600e8-c4f5-416a-8138-51c87928cbdf" containerName="util"
Dec 03 12:40:20 crc kubenswrapper[4711]: E1203 12:40:20.844661 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d27600e8-c4f5-416a-8138-51c87928cbdf" containerName="extract"
Dec 03 12:40:20 crc kubenswrapper[4711]: I1203 12:40:20.844670 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="d27600e8-c4f5-416a-8138-51c87928cbdf" containerName="extract"
Dec 03 12:40:20 crc kubenswrapper[4711]: I1203 12:40:20.844773 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="d27600e8-c4f5-416a-8138-51c87928cbdf" containerName="extract"
Dec 03 12:40:20 crc kubenswrapper[4711]: I1203 12:40:20.845204 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-69f7cc4dcf-wgllq"
Dec 03 12:40:20 crc kubenswrapper[4711]: I1203 12:40:20.848428 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Dec 03 12:40:20 crc kubenswrapper[4711]: I1203 12:40:20.848609 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert"
Dec 03 12:40:20 crc kubenswrapper[4711]: I1203 12:40:20.849228 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-dl2h9"
Dec 03 12:40:20 crc kubenswrapper[4711]: I1203 12:40:20.861051 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-69f7cc4dcf-wgllq"]
Dec 03 12:40:20 crc kubenswrapper[4711]: I1203 12:40:20.964709 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/49682020-cbd0-4b76-941d-2e0ae637db0b-apiservice-cert\") pod \"mariadb-operator-controller-manager-69f7cc4dcf-wgllq\" (UID: \"49682020-cbd0-4b76-941d-2e0ae637db0b\") " pod="openstack-operators/mariadb-operator-controller-manager-69f7cc4dcf-wgllq"
Dec 03 12:40:20 crc kubenswrapper[4711]: I1203 12:40:20.965677 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/49682020-cbd0-4b76-941d-2e0ae637db0b-webhook-cert\") pod
\"mariadb-operator-controller-manager-69f7cc4dcf-wgllq\" (UID: \"49682020-cbd0-4b76-941d-2e0ae637db0b\") " pod="openstack-operators/mariadb-operator-controller-manager-69f7cc4dcf-wgllq" Dec 03 12:40:20 crc kubenswrapper[4711]: I1203 12:40:20.965771 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg4fz\" (UniqueName: \"kubernetes.io/projected/49682020-cbd0-4b76-941d-2e0ae637db0b-kube-api-access-rg4fz\") pod \"mariadb-operator-controller-manager-69f7cc4dcf-wgllq\" (UID: \"49682020-cbd0-4b76-941d-2e0ae637db0b\") " pod="openstack-operators/mariadb-operator-controller-manager-69f7cc4dcf-wgllq" Dec 03 12:40:21 crc kubenswrapper[4711]: I1203 12:40:21.066822 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/49682020-cbd0-4b76-941d-2e0ae637db0b-apiservice-cert\") pod \"mariadb-operator-controller-manager-69f7cc4dcf-wgllq\" (UID: \"49682020-cbd0-4b76-941d-2e0ae637db0b\") " pod="openstack-operators/mariadb-operator-controller-manager-69f7cc4dcf-wgllq" Dec 03 12:40:21 crc kubenswrapper[4711]: I1203 12:40:21.066898 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/49682020-cbd0-4b76-941d-2e0ae637db0b-webhook-cert\") pod \"mariadb-operator-controller-manager-69f7cc4dcf-wgllq\" (UID: \"49682020-cbd0-4b76-941d-2e0ae637db0b\") " pod="openstack-operators/mariadb-operator-controller-manager-69f7cc4dcf-wgllq" Dec 03 12:40:21 crc kubenswrapper[4711]: I1203 12:40:21.066934 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg4fz\" (UniqueName: \"kubernetes.io/projected/49682020-cbd0-4b76-941d-2e0ae637db0b-kube-api-access-rg4fz\") pod \"mariadb-operator-controller-manager-69f7cc4dcf-wgllq\" (UID: \"49682020-cbd0-4b76-941d-2e0ae637db0b\") " 
pod="openstack-operators/mariadb-operator-controller-manager-69f7cc4dcf-wgllq" Dec 03 12:40:21 crc kubenswrapper[4711]: I1203 12:40:21.073045 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/49682020-cbd0-4b76-941d-2e0ae637db0b-webhook-cert\") pod \"mariadb-operator-controller-manager-69f7cc4dcf-wgllq\" (UID: \"49682020-cbd0-4b76-941d-2e0ae637db0b\") " pod="openstack-operators/mariadb-operator-controller-manager-69f7cc4dcf-wgllq" Dec 03 12:40:21 crc kubenswrapper[4711]: I1203 12:40:21.076770 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/49682020-cbd0-4b76-941d-2e0ae637db0b-apiservice-cert\") pod \"mariadb-operator-controller-manager-69f7cc4dcf-wgllq\" (UID: \"49682020-cbd0-4b76-941d-2e0ae637db0b\") " pod="openstack-operators/mariadb-operator-controller-manager-69f7cc4dcf-wgllq" Dec 03 12:40:21 crc kubenswrapper[4711]: I1203 12:40:21.084520 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg4fz\" (UniqueName: \"kubernetes.io/projected/49682020-cbd0-4b76-941d-2e0ae637db0b-kube-api-access-rg4fz\") pod \"mariadb-operator-controller-manager-69f7cc4dcf-wgllq\" (UID: \"49682020-cbd0-4b76-941d-2e0ae637db0b\") " pod="openstack-operators/mariadb-operator-controller-manager-69f7cc4dcf-wgllq" Dec 03 12:40:21 crc kubenswrapper[4711]: I1203 12:40:21.163364 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-69f7cc4dcf-wgllq" Dec 03 12:40:21 crc kubenswrapper[4711]: I1203 12:40:21.400644 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-69f7cc4dcf-wgllq"] Dec 03 12:40:21 crc kubenswrapper[4711]: I1203 12:40:21.645610 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-69f7cc4dcf-wgllq" event={"ID":"49682020-cbd0-4b76-941d-2e0ae637db0b","Type":"ContainerStarted","Data":"b3890e0cc8d22d98bef7db4d9c11ce068f119de23935cd49ce8485df0aa3aca1"} Dec 03 12:40:25 crc kubenswrapper[4711]: I1203 12:40:25.669887 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-69f7cc4dcf-wgllq" event={"ID":"49682020-cbd0-4b76-941d-2e0ae637db0b","Type":"ContainerStarted","Data":"e5c8f5ab07dc699ed7d970bd1ea8bb9c492e5d0aee21289d3be13ce1e6d744fe"} Dec 03 12:40:25 crc kubenswrapper[4711]: I1203 12:40:25.670436 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-69f7cc4dcf-wgllq" Dec 03 12:40:25 crc kubenswrapper[4711]: I1203 12:40:25.690877 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-69f7cc4dcf-wgllq" podStartSLOduration=1.9498839829999999 podStartE2EDuration="5.690857551s" podCreationTimestamp="2025-12-03 12:40:20 +0000 UTC" firstStartedPulling="2025-12-03 12:40:21.408916197 +0000 UTC m=+1540.078167452" lastFinishedPulling="2025-12-03 12:40:25.149889765 +0000 UTC m=+1543.819141020" observedRunningTime="2025-12-03 12:40:25.688886076 +0000 UTC m=+1544.358137351" watchObservedRunningTime="2025-12-03 12:40:25.690857551 +0000 UTC m=+1544.360108806" Dec 03 12:40:31 crc kubenswrapper[4711]: I1203 12:40:31.169834 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/mariadb-operator-controller-manager-69f7cc4dcf-wgllq" Dec 03 12:40:35 crc kubenswrapper[4711]: I1203 12:40:35.402120 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:40:35 crc kubenswrapper[4711]: I1203 12:40:35.402704 4711 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:40:35 crc kubenswrapper[4711]: I1203 12:40:35.402767 4711 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" Dec 03 12:40:35 crc kubenswrapper[4711]: I1203 12:40:35.403510 4711 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8aceb8eade31346b968b79b9eafdd0ba0740fbe4943a8f7cf724864aa6dd593f"} pod="openshift-machine-config-operator/machine-config-daemon-52jgg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 12:40:35 crc kubenswrapper[4711]: I1203 12:40:35.403567 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" containerID="cri-o://8aceb8eade31346b968b79b9eafdd0ba0740fbe4943a8f7cf724864aa6dd593f" gracePeriod=600 Dec 03 12:40:36 crc kubenswrapper[4711]: E1203 12:40:36.049883 4711 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:40:36 crc kubenswrapper[4711]: I1203 12:40:36.282281 4711 generic.go:334] "Generic (PLEG): container finished" podID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerID="8aceb8eade31346b968b79b9eafdd0ba0740fbe4943a8f7cf724864aa6dd593f" exitCode=0 Dec 03 12:40:36 crc kubenswrapper[4711]: I1203 12:40:36.282333 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" event={"ID":"776e7d35-d59b-4d4a-97cd-aec4f2441c1e","Type":"ContainerDied","Data":"8aceb8eade31346b968b79b9eafdd0ba0740fbe4943a8f7cf724864aa6dd593f"} Dec 03 12:40:36 crc kubenswrapper[4711]: I1203 12:40:36.282372 4711 scope.go:117] "RemoveContainer" containerID="c2221b1d8dd35869dac84859caa67252140cee66906c26379e7532c6528f6458" Dec 03 12:40:36 crc kubenswrapper[4711]: I1203 12:40:36.283008 4711 scope.go:117] "RemoveContainer" containerID="8aceb8eade31346b968b79b9eafdd0ba0740fbe4943a8f7cf724864aa6dd593f" Dec 03 12:40:36 crc kubenswrapper[4711]: E1203 12:40:36.283200 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:40:36 crc kubenswrapper[4711]: I1203 12:40:36.625321 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-rmc57"] Dec 03 
12:40:36 crc kubenswrapper[4711]: I1203 12:40:36.626212 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-rmc57" Dec 03 12:40:36 crc kubenswrapper[4711]: I1203 12:40:36.628617 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-6nqjc" Dec 03 12:40:36 crc kubenswrapper[4711]: I1203 12:40:36.639828 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-rmc57"] Dec 03 12:40:36 crc kubenswrapper[4711]: I1203 12:40:36.814325 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9nhx\" (UniqueName: \"kubernetes.io/projected/b7bb5832-bc73-4090-ac27-fb8809668c72-kube-api-access-r9nhx\") pod \"infra-operator-index-rmc57\" (UID: \"b7bb5832-bc73-4090-ac27-fb8809668c72\") " pod="openstack-operators/infra-operator-index-rmc57" Dec 03 12:40:36 crc kubenswrapper[4711]: I1203 12:40:36.915659 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9nhx\" (UniqueName: \"kubernetes.io/projected/b7bb5832-bc73-4090-ac27-fb8809668c72-kube-api-access-r9nhx\") pod \"infra-operator-index-rmc57\" (UID: \"b7bb5832-bc73-4090-ac27-fb8809668c72\") " pod="openstack-operators/infra-operator-index-rmc57" Dec 03 12:40:36 crc kubenswrapper[4711]: I1203 12:40:36.934229 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9nhx\" (UniqueName: \"kubernetes.io/projected/b7bb5832-bc73-4090-ac27-fb8809668c72-kube-api-access-r9nhx\") pod \"infra-operator-index-rmc57\" (UID: \"b7bb5832-bc73-4090-ac27-fb8809668c72\") " pod="openstack-operators/infra-operator-index-rmc57" Dec 03 12:40:36 crc kubenswrapper[4711]: I1203 12:40:36.987790 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-rmc57" Dec 03 12:40:37 crc kubenswrapper[4711]: I1203 12:40:37.203481 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-rmc57"] Dec 03 12:40:37 crc kubenswrapper[4711]: W1203 12:40:37.209660 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7bb5832_bc73_4090_ac27_fb8809668c72.slice/crio-37bea28d83620f1869e085391489675609e6319509875782ea8224665a4a6b11 WatchSource:0}: Error finding container 37bea28d83620f1869e085391489675609e6319509875782ea8224665a4a6b11: Status 404 returned error can't find the container with id 37bea28d83620f1869e085391489675609e6319509875782ea8224665a4a6b11 Dec 03 12:40:37 crc kubenswrapper[4711]: I1203 12:40:37.291821 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-rmc57" event={"ID":"b7bb5832-bc73-4090-ac27-fb8809668c72","Type":"ContainerStarted","Data":"37bea28d83620f1869e085391489675609e6319509875782ea8224665a4a6b11"} Dec 03 12:40:45 crc kubenswrapper[4711]: I1203 12:40:45.350515 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-rmc57" event={"ID":"b7bb5832-bc73-4090-ac27-fb8809668c72","Type":"ContainerStarted","Data":"b72fd4c14284a0ed19a0f6a5ceaa467b93f179ac35980199d1c93eede071f84b"} Dec 03 12:40:45 crc kubenswrapper[4711]: I1203 12:40:45.373793 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-rmc57" podStartSLOduration=1.506948062 podStartE2EDuration="9.373770437s" podCreationTimestamp="2025-12-03 12:40:36 +0000 UTC" firstStartedPulling="2025-12-03 12:40:37.21621285 +0000 UTC m=+1555.885464095" lastFinishedPulling="2025-12-03 12:40:45.083035185 +0000 UTC m=+1563.752286470" observedRunningTime="2025-12-03 12:40:45.367127244 +0000 UTC m=+1564.036378539" 
watchObservedRunningTime="2025-12-03 12:40:45.373770437 +0000 UTC m=+1564.043021702" Dec 03 12:40:46 crc kubenswrapper[4711]: I1203 12:40:46.988729 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-index-rmc57" Dec 03 12:40:46 crc kubenswrapper[4711]: I1203 12:40:46.989562 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/infra-operator-index-rmc57" Dec 03 12:40:47 crc kubenswrapper[4711]: I1203 12:40:47.028777 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/infra-operator-index-rmc57" Dec 03 12:40:48 crc kubenswrapper[4711]: I1203 12:40:48.816862 4711 scope.go:117] "RemoveContainer" containerID="8aceb8eade31346b968b79b9eafdd0ba0740fbe4943a8f7cf724864aa6dd593f" Dec 03 12:40:48 crc kubenswrapper[4711]: E1203 12:40:48.817477 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:40:57 crc kubenswrapper[4711]: I1203 12:40:57.024348 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-index-rmc57" Dec 03 12:41:00 crc kubenswrapper[4711]: I1203 12:41:00.817467 4711 scope.go:117] "RemoveContainer" containerID="8aceb8eade31346b968b79b9eafdd0ba0740fbe4943a8f7cf724864aa6dd593f" Dec 03 12:41:00 crc kubenswrapper[4711]: E1203 12:41:00.817946 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:41:07 crc kubenswrapper[4711]: I1203 12:41:07.336154 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz"] Dec 03 12:41:07 crc kubenswrapper[4711]: I1203 12:41:07.338537 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz" Dec 03 12:41:07 crc kubenswrapper[4711]: I1203 12:41:07.341074 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-gw8hz" Dec 03 12:41:07 crc kubenswrapper[4711]: I1203 12:41:07.346445 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz"] Dec 03 12:41:07 crc kubenswrapper[4711]: I1203 12:41:07.351666 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vz5c\" (UniqueName: \"kubernetes.io/projected/c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3-kube-api-access-8vz5c\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz\" (UID: \"c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz" Dec 03 12:41:07 crc kubenswrapper[4711]: I1203 12:41:07.374559 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3-util\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz\" (UID: \"c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3\") " 
pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz" Dec 03 12:41:07 crc kubenswrapper[4711]: I1203 12:41:07.374739 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3-bundle\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz\" (UID: \"c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz" Dec 03 12:41:07 crc kubenswrapper[4711]: I1203 12:41:07.476359 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3-bundle\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz\" (UID: \"c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz" Dec 03 12:41:07 crc kubenswrapper[4711]: I1203 12:41:07.476438 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vz5c\" (UniqueName: \"kubernetes.io/projected/c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3-kube-api-access-8vz5c\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz\" (UID: \"c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz" Dec 03 12:41:07 crc kubenswrapper[4711]: I1203 12:41:07.476490 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3-util\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz\" (UID: \"c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz" Dec 03 12:41:07 crc kubenswrapper[4711]: I1203 
12:41:07.476983 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3-util\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz\" (UID: \"c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz" Dec 03 12:41:07 crc kubenswrapper[4711]: I1203 12:41:07.477009 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3-bundle\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz\" (UID: \"c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz" Dec 03 12:41:07 crc kubenswrapper[4711]: I1203 12:41:07.495781 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vz5c\" (UniqueName: \"kubernetes.io/projected/c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3-kube-api-access-8vz5c\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz\" (UID: \"c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz" Dec 03 12:41:07 crc kubenswrapper[4711]: I1203 12:41:07.676832 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz" Dec 03 12:41:08 crc kubenswrapper[4711]: I1203 12:41:08.115745 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz"] Dec 03 12:41:08 crc kubenswrapper[4711]: W1203 12:41:08.127667 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9b765e7_1caf_4b36_98c9_8a5d5c0b88b3.slice/crio-0b7dcbba53bd8f7f73c3bfe8f44077e57f067389b84ac6c51b2ab7a53f3664fd WatchSource:0}: Error finding container 0b7dcbba53bd8f7f73c3bfe8f44077e57f067389b84ac6c51b2ab7a53f3664fd: Status 404 returned error can't find the container with id 0b7dcbba53bd8f7f73c3bfe8f44077e57f067389b84ac6c51b2ab7a53f3664fd Dec 03 12:41:08 crc kubenswrapper[4711]: I1203 12:41:08.507559 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz" event={"ID":"c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3","Type":"ContainerStarted","Data":"0b7dcbba53bd8f7f73c3bfe8f44077e57f067389b84ac6c51b2ab7a53f3664fd"} Dec 03 12:41:13 crc kubenswrapper[4711]: I1203 12:41:13.547823 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz" event={"ID":"c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3","Type":"ContainerStarted","Data":"a984fa49952846200e3a0d03a71410f1493d298822bccbc51932bac147593346"} Dec 03 12:41:13 crc kubenswrapper[4711]: I1203 12:41:13.817607 4711 scope.go:117] "RemoveContainer" containerID="8aceb8eade31346b968b79b9eafdd0ba0740fbe4943a8f7cf724864aa6dd593f" Dec 03 12:41:13 crc kubenswrapper[4711]: E1203 12:41:13.817870 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:41:14 crc kubenswrapper[4711]: I1203 12:41:14.554074 4711 generic.go:334] "Generic (PLEG): container finished" podID="c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3" containerID="a984fa49952846200e3a0d03a71410f1493d298822bccbc51932bac147593346" exitCode=0 Dec 03 12:41:14 crc kubenswrapper[4711]: I1203 12:41:14.554194 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz" event={"ID":"c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3","Type":"ContainerDied","Data":"a984fa49952846200e3a0d03a71410f1493d298822bccbc51932bac147593346"} Dec 03 12:41:15 crc kubenswrapper[4711]: I1203 12:41:15.561518 4711 generic.go:334] "Generic (PLEG): container finished" podID="c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3" containerID="be05009c257abc8945b4fdb874a7a5bf33c3519c2b31d57836239e30c07f7145" exitCode=0 Dec 03 12:41:15 crc kubenswrapper[4711]: I1203 12:41:15.561604 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz" event={"ID":"c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3","Type":"ContainerDied","Data":"be05009c257abc8945b4fdb874a7a5bf33c3519c2b31d57836239e30c07f7145"} Dec 03 12:41:16 crc kubenswrapper[4711]: I1203 12:41:16.573528 4711 generic.go:334] "Generic (PLEG): container finished" podID="c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3" containerID="664124271e969e42204febacf84d723a22703169dc9ddc23f3171fe89243a619" exitCode=0 Dec 03 12:41:16 crc kubenswrapper[4711]: I1203 12:41:16.573661 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz" 
event={"ID":"c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3","Type":"ContainerDied","Data":"664124271e969e42204febacf84d723a22703169dc9ddc23f3171fe89243a619"} Dec 03 12:41:17 crc kubenswrapper[4711]: I1203 12:41:17.836073 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz" Dec 03 12:41:17 crc kubenswrapper[4711]: I1203 12:41:17.937235 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vz5c\" (UniqueName: \"kubernetes.io/projected/c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3-kube-api-access-8vz5c\") pod \"c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3\" (UID: \"c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3\") " Dec 03 12:41:17 crc kubenswrapper[4711]: I1203 12:41:17.937305 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3-bundle\") pod \"c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3\" (UID: \"c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3\") " Dec 03 12:41:17 crc kubenswrapper[4711]: I1203 12:41:17.937395 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3-util\") pod \"c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3\" (UID: \"c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3\") " Dec 03 12:41:17 crc kubenswrapper[4711]: I1203 12:41:17.939146 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3-bundle" (OuterVolumeSpecName: "bundle") pod "c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3" (UID: "c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:41:17 crc kubenswrapper[4711]: I1203 12:41:17.943861 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3-kube-api-access-8vz5c" (OuterVolumeSpecName: "kube-api-access-8vz5c") pod "c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3" (UID: "c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3"). InnerVolumeSpecName "kube-api-access-8vz5c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:41:18 crc kubenswrapper[4711]: I1203 12:41:18.039095 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vz5c\" (UniqueName: \"kubernetes.io/projected/c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3-kube-api-access-8vz5c\") on node \"crc\" DevicePath \"\""
Dec 03 12:41:18 crc kubenswrapper[4711]: I1203 12:41:18.039128 4711 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 12:41:18 crc kubenswrapper[4711]: I1203 12:41:18.165613 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3-util" (OuterVolumeSpecName: "util") pod "c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3" (UID: "c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:41:18 crc kubenswrapper[4711]: I1203 12:41:18.241946 4711 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3-util\") on node \"crc\" DevicePath \"\""
Dec 03 12:41:18 crc kubenswrapper[4711]: I1203 12:41:18.601727 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz" event={"ID":"c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3","Type":"ContainerDied","Data":"0b7dcbba53bd8f7f73c3bfe8f44077e57f067389b84ac6c51b2ab7a53f3664fd"}
Dec 03 12:41:18 crc kubenswrapper[4711]: I1203 12:41:18.602245 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b7dcbba53bd8f7f73c3bfe8f44077e57f067389b84ac6c51b2ab7a53f3664fd"
Dec 03 12:41:18 crc kubenswrapper[4711]: I1203 12:41:18.601815 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz"
Dec 03 12:41:25 crc kubenswrapper[4711]: I1203 12:41:25.814489 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-56b7d7797c-sljsp"]
Dec 03 12:41:25 crc kubenswrapper[4711]: E1203 12:41:25.815323 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3" containerName="pull"
Dec 03 12:41:25 crc kubenswrapper[4711]: I1203 12:41:25.815340 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3" containerName="pull"
Dec 03 12:41:25 crc kubenswrapper[4711]: E1203 12:41:25.815377 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3" containerName="extract"
Dec 03 12:41:25 crc kubenswrapper[4711]: I1203 12:41:25.815386 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3" containerName="extract"
Dec 03 12:41:25 crc kubenswrapper[4711]: E1203 12:41:25.815401 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3" containerName="util"
Dec 03 12:41:25 crc kubenswrapper[4711]: I1203 12:41:25.815409 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3" containerName="util"
Dec 03 12:41:25 crc kubenswrapper[4711]: I1203 12:41:25.815524 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3" containerName="extract"
Dec 03 12:41:25 crc kubenswrapper[4711]: I1203 12:41:25.816266 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-56b7d7797c-sljsp"
Dec 03 12:41:25 crc kubenswrapper[4711]: I1203 12:41:25.818691 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-9n64n"
Dec 03 12:41:25 crc kubenswrapper[4711]: I1203 12:41:25.818866 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert"
Dec 03 12:41:25 crc kubenswrapper[4711]: I1203 12:41:25.845114 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-56b7d7797c-sljsp"]
Dec 03 12:41:25 crc kubenswrapper[4711]: I1203 12:41:25.946082 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9799l\" (UniqueName: \"kubernetes.io/projected/07432873-5f89-4207-bc9f-d93994d12733-kube-api-access-9799l\") pod \"infra-operator-controller-manager-56b7d7797c-sljsp\" (UID: \"07432873-5f89-4207-bc9f-d93994d12733\") " pod="openstack-operators/infra-operator-controller-manager-56b7d7797c-sljsp"
Dec 03 12:41:25 crc kubenswrapper[4711]: I1203 12:41:25.946147 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/07432873-5f89-4207-bc9f-d93994d12733-apiservice-cert\") pod \"infra-operator-controller-manager-56b7d7797c-sljsp\" (UID: \"07432873-5f89-4207-bc9f-d93994d12733\") " pod="openstack-operators/infra-operator-controller-manager-56b7d7797c-sljsp"
Dec 03 12:41:25 crc kubenswrapper[4711]: I1203 12:41:25.946172 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/07432873-5f89-4207-bc9f-d93994d12733-webhook-cert\") pod \"infra-operator-controller-manager-56b7d7797c-sljsp\" (UID: \"07432873-5f89-4207-bc9f-d93994d12733\") " pod="openstack-operators/infra-operator-controller-manager-56b7d7797c-sljsp"
Dec 03 12:41:26 crc kubenswrapper[4711]: I1203 12:41:26.047994 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9799l\" (UniqueName: \"kubernetes.io/projected/07432873-5f89-4207-bc9f-d93994d12733-kube-api-access-9799l\") pod \"infra-operator-controller-manager-56b7d7797c-sljsp\" (UID: \"07432873-5f89-4207-bc9f-d93994d12733\") " pod="openstack-operators/infra-operator-controller-manager-56b7d7797c-sljsp"
Dec 03 12:41:26 crc kubenswrapper[4711]: I1203 12:41:26.048212 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/07432873-5f89-4207-bc9f-d93994d12733-apiservice-cert\") pod \"infra-operator-controller-manager-56b7d7797c-sljsp\" (UID: \"07432873-5f89-4207-bc9f-d93994d12733\") " pod="openstack-operators/infra-operator-controller-manager-56b7d7797c-sljsp"
Dec 03 12:41:26 crc kubenswrapper[4711]: I1203 12:41:26.048302 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/07432873-5f89-4207-bc9f-d93994d12733-webhook-cert\") pod \"infra-operator-controller-manager-56b7d7797c-sljsp\" (UID: \"07432873-5f89-4207-bc9f-d93994d12733\") " pod="openstack-operators/infra-operator-controller-manager-56b7d7797c-sljsp"
Dec 03 12:41:26 crc kubenswrapper[4711]: I1203 12:41:26.054800 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/07432873-5f89-4207-bc9f-d93994d12733-webhook-cert\") pod \"infra-operator-controller-manager-56b7d7797c-sljsp\" (UID: \"07432873-5f89-4207-bc9f-d93994d12733\") " pod="openstack-operators/infra-operator-controller-manager-56b7d7797c-sljsp"
Dec 03 12:41:26 crc kubenswrapper[4711]: I1203 12:41:26.059503 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/07432873-5f89-4207-bc9f-d93994d12733-apiservice-cert\") pod \"infra-operator-controller-manager-56b7d7797c-sljsp\" (UID: \"07432873-5f89-4207-bc9f-d93994d12733\") " pod="openstack-operators/infra-operator-controller-manager-56b7d7797c-sljsp"
Dec 03 12:41:26 crc kubenswrapper[4711]: I1203 12:41:26.064706 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9799l\" (UniqueName: \"kubernetes.io/projected/07432873-5f89-4207-bc9f-d93994d12733-kube-api-access-9799l\") pod \"infra-operator-controller-manager-56b7d7797c-sljsp\" (UID: \"07432873-5f89-4207-bc9f-d93994d12733\") " pod="openstack-operators/infra-operator-controller-manager-56b7d7797c-sljsp"
Dec 03 12:41:26 crc kubenswrapper[4711]: I1203 12:41:26.140996 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-56b7d7797c-sljsp"
Dec 03 12:41:26 crc kubenswrapper[4711]: I1203 12:41:26.338853 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-56b7d7797c-sljsp"]
Dec 03 12:41:26 crc kubenswrapper[4711]: W1203 12:41:26.345597 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07432873_5f89_4207_bc9f_d93994d12733.slice/crio-fb04884fcd68afbf5aa151b6a4b08e3962c9e2af85cadc692bc8e42642e8785a WatchSource:0}: Error finding container fb04884fcd68afbf5aa151b6a4b08e3962c9e2af85cadc692bc8e42642e8785a: Status 404 returned error can't find the container with id fb04884fcd68afbf5aa151b6a4b08e3962c9e2af85cadc692bc8e42642e8785a
Dec 03 12:41:26 crc kubenswrapper[4711]: I1203 12:41:26.656018 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-56b7d7797c-sljsp" event={"ID":"07432873-5f89-4207-bc9f-d93994d12733","Type":"ContainerStarted","Data":"fb04884fcd68afbf5aa151b6a4b08e3962c9e2af85cadc692bc8e42642e8785a"}
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.031695 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-0"]
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.032880 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-0"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.046208 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"galera-openstack-dockercfg-kcpqt"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.046431 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config-data"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.046711 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"kube-root-ca.crt"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.046878 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openshift-service-ca.crt"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.047100 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.065741 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-1"]
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.067095 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-1"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.070311 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-2"]
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.073151 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-2"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.083393 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"]
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.100699 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"]
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.104963 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"]
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.159984 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9afca0df-f7de-4a6f-88bf-49378019f63d-config-data-default\") pod \"openstack-galera-0\" (UID: \"9afca0df-f7de-4a6f-88bf-49378019f63d\") " pod="glance-kuttl-tests/openstack-galera-0"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.160031 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9afca0df-f7de-4a6f-88bf-49378019f63d-kolla-config\") pod \"openstack-galera-0\" (UID: \"9afca0df-f7de-4a6f-88bf-49378019f63d\") " pod="glance-kuttl-tests/openstack-galera-0"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.160072 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tnq6\" (UniqueName: \"kubernetes.io/projected/9afca0df-f7de-4a6f-88bf-49378019f63d-kube-api-access-5tnq6\") pod \"openstack-galera-0\" (UID: \"9afca0df-f7de-4a6f-88bf-49378019f63d\") " pod="glance-kuttl-tests/openstack-galera-0"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.160256 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9afca0df-f7de-4a6f-88bf-49378019f63d-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9afca0df-f7de-4a6f-88bf-49378019f63d\") " pod="glance-kuttl-tests/openstack-galera-0"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.160346 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9afca0df-f7de-4a6f-88bf-49378019f63d-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9afca0df-f7de-4a6f-88bf-49378019f63d\") " pod="glance-kuttl-tests/openstack-galera-0"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.160549 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"9afca0df-f7de-4a6f-88bf-49378019f63d\") " pod="glance-kuttl-tests/openstack-galera-0"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.261923 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-2\" (UID: \"31b86704-7f22-45ea-994f-26a4f953e56b\") " pod="glance-kuttl-tests/openstack-galera-2"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.261999 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66535e59-1359-4bb1-bc04-7cf81d7fedc6-operator-scripts\") pod \"openstack-galera-1\" (UID: \"66535e59-1359-4bb1-bc04-7cf81d7fedc6\") " pod="glance-kuttl-tests/openstack-galera-1"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.262034 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9afca0df-f7de-4a6f-88bf-49378019f63d-config-data-default\") pod \"openstack-galera-0\" (UID: \"9afca0df-f7de-4a6f-88bf-49378019f63d\") " pod="glance-kuttl-tests/openstack-galera-0"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.262057 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-1\" (UID: \"66535e59-1359-4bb1-bc04-7cf81d7fedc6\") " pod="glance-kuttl-tests/openstack-galera-1"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.262076 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9afca0df-f7de-4a6f-88bf-49378019f63d-kolla-config\") pod \"openstack-galera-0\" (UID: \"9afca0df-f7de-4a6f-88bf-49378019f63d\") " pod="glance-kuttl-tests/openstack-galera-0"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.262109 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq4p8\" (UniqueName: \"kubernetes.io/projected/66535e59-1359-4bb1-bc04-7cf81d7fedc6-kube-api-access-gq4p8\") pod \"openstack-galera-1\" (UID: \"66535e59-1359-4bb1-bc04-7cf81d7fedc6\") " pod="glance-kuttl-tests/openstack-galera-1"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.262180 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/31b86704-7f22-45ea-994f-26a4f953e56b-config-data-default\") pod \"openstack-galera-2\" (UID: \"31b86704-7f22-45ea-994f-26a4f953e56b\") " pod="glance-kuttl-tests/openstack-galera-2"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.262203 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tnq6\" (UniqueName: \"kubernetes.io/projected/9afca0df-f7de-4a6f-88bf-49378019f63d-kube-api-access-5tnq6\") pod \"openstack-galera-0\" (UID: \"9afca0df-f7de-4a6f-88bf-49378019f63d\") " pod="glance-kuttl-tests/openstack-galera-0"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.262223 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/66535e59-1359-4bb1-bc04-7cf81d7fedc6-config-data-default\") pod \"openstack-galera-1\" (UID: \"66535e59-1359-4bb1-bc04-7cf81d7fedc6\") " pod="glance-kuttl-tests/openstack-galera-1"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.262247 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/66535e59-1359-4bb1-bc04-7cf81d7fedc6-config-data-generated\") pod \"openstack-galera-1\" (UID: \"66535e59-1359-4bb1-bc04-7cf81d7fedc6\") " pod="glance-kuttl-tests/openstack-galera-1"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.262280 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkqfb\" (UniqueName: \"kubernetes.io/projected/31b86704-7f22-45ea-994f-26a4f953e56b-kube-api-access-wkqfb\") pod \"openstack-galera-2\" (UID: \"31b86704-7f22-45ea-994f-26a4f953e56b\") " pod="glance-kuttl-tests/openstack-galera-2"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.262305 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/31b86704-7f22-45ea-994f-26a4f953e56b-kolla-config\") pod \"openstack-galera-2\" (UID: \"31b86704-7f22-45ea-994f-26a4f953e56b\") " pod="glance-kuttl-tests/openstack-galera-2"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.262330 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/31b86704-7f22-45ea-994f-26a4f953e56b-config-data-generated\") pod \"openstack-galera-2\" (UID: \"31b86704-7f22-45ea-994f-26a4f953e56b\") " pod="glance-kuttl-tests/openstack-galera-2"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.262355 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31b86704-7f22-45ea-994f-26a4f953e56b-operator-scripts\") pod \"openstack-galera-2\" (UID: \"31b86704-7f22-45ea-994f-26a4f953e56b\") " pod="glance-kuttl-tests/openstack-galera-2"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.262385 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9afca0df-f7de-4a6f-88bf-49378019f63d-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9afca0df-f7de-4a6f-88bf-49378019f63d\") " pod="glance-kuttl-tests/openstack-galera-0"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.262408 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/66535e59-1359-4bb1-bc04-7cf81d7fedc6-kolla-config\") pod \"openstack-galera-1\" (UID: \"66535e59-1359-4bb1-bc04-7cf81d7fedc6\") " pod="glance-kuttl-tests/openstack-galera-1"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.262432 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9afca0df-f7de-4a6f-88bf-49378019f63d-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9afca0df-f7de-4a6f-88bf-49378019f63d\") " pod="glance-kuttl-tests/openstack-galera-0"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.262466 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"9afca0df-f7de-4a6f-88bf-49378019f63d\") " pod="glance-kuttl-tests/openstack-galera-0"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.262775 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"9afca0df-f7de-4a6f-88bf-49378019f63d\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/openstack-galera-0"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.263563 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9afca0df-f7de-4a6f-88bf-49378019f63d-config-data-default\") pod \"openstack-galera-0\" (UID: \"9afca0df-f7de-4a6f-88bf-49378019f63d\") " pod="glance-kuttl-tests/openstack-galera-0"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.263705 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9afca0df-f7de-4a6f-88bf-49378019f63d-kolla-config\") pod \"openstack-galera-0\" (UID: \"9afca0df-f7de-4a6f-88bf-49378019f63d\") " pod="glance-kuttl-tests/openstack-galera-0"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.263843 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9afca0df-f7de-4a6f-88bf-49378019f63d-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9afca0df-f7de-4a6f-88bf-49378019f63d\") " pod="glance-kuttl-tests/openstack-galera-0"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.264867 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9afca0df-f7de-4a6f-88bf-49378019f63d-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9afca0df-f7de-4a6f-88bf-49378019f63d\") " pod="glance-kuttl-tests/openstack-galera-0"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.282151 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tnq6\" (UniqueName: \"kubernetes.io/projected/9afca0df-f7de-4a6f-88bf-49378019f63d-kube-api-access-5tnq6\") pod \"openstack-galera-0\" (UID: \"9afca0df-f7de-4a6f-88bf-49378019f63d\") " pod="glance-kuttl-tests/openstack-galera-0"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.287152 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"9afca0df-f7de-4a6f-88bf-49378019f63d\") " pod="glance-kuttl-tests/openstack-galera-0"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.351189 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-0"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.364495 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq4p8\" (UniqueName: \"kubernetes.io/projected/66535e59-1359-4bb1-bc04-7cf81d7fedc6-kube-api-access-gq4p8\") pod \"openstack-galera-1\" (UID: \"66535e59-1359-4bb1-bc04-7cf81d7fedc6\") " pod="glance-kuttl-tests/openstack-galera-1"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.364707 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/31b86704-7f22-45ea-994f-26a4f953e56b-config-data-default\") pod \"openstack-galera-2\" (UID: \"31b86704-7f22-45ea-994f-26a4f953e56b\") " pod="glance-kuttl-tests/openstack-galera-2"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.364739 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/66535e59-1359-4bb1-bc04-7cf81d7fedc6-config-data-default\") pod \"openstack-galera-1\" (UID: \"66535e59-1359-4bb1-bc04-7cf81d7fedc6\") " pod="glance-kuttl-tests/openstack-galera-1"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.364794 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/66535e59-1359-4bb1-bc04-7cf81d7fedc6-config-data-generated\") pod \"openstack-galera-1\" (UID: \"66535e59-1359-4bb1-bc04-7cf81d7fedc6\") " pod="glance-kuttl-tests/openstack-galera-1"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.364822 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkqfb\" (UniqueName: \"kubernetes.io/projected/31b86704-7f22-45ea-994f-26a4f953e56b-kube-api-access-wkqfb\") pod \"openstack-galera-2\" (UID: \"31b86704-7f22-45ea-994f-26a4f953e56b\") " pod="glance-kuttl-tests/openstack-galera-2"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.364877 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/31b86704-7f22-45ea-994f-26a4f953e56b-kolla-config\") pod \"openstack-galera-2\" (UID: \"31b86704-7f22-45ea-994f-26a4f953e56b\") " pod="glance-kuttl-tests/openstack-galera-2"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.365982 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/66535e59-1359-4bb1-bc04-7cf81d7fedc6-config-data-generated\") pod \"openstack-galera-1\" (UID: \"66535e59-1359-4bb1-bc04-7cf81d7fedc6\") " pod="glance-kuttl-tests/openstack-galera-1"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.366971 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/31b86704-7f22-45ea-994f-26a4f953e56b-config-data-default\") pod \"openstack-galera-2\" (UID: \"31b86704-7f22-45ea-994f-26a4f953e56b\") " pod="glance-kuttl-tests/openstack-galera-2"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.367382 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/66535e59-1359-4bb1-bc04-7cf81d7fedc6-config-data-default\") pod \"openstack-galera-1\" (UID: \"66535e59-1359-4bb1-bc04-7cf81d7fedc6\") " pod="glance-kuttl-tests/openstack-galera-1"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.367457 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/31b86704-7f22-45ea-994f-26a4f953e56b-kolla-config\") pod \"openstack-galera-2\" (UID: \"31b86704-7f22-45ea-994f-26a4f953e56b\") " pod="glance-kuttl-tests/openstack-galera-2"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.367467 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/31b86704-7f22-45ea-994f-26a4f953e56b-config-data-generated\") pod \"openstack-galera-2\" (UID: \"31b86704-7f22-45ea-994f-26a4f953e56b\") " pod="glance-kuttl-tests/openstack-galera-2"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.367514 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31b86704-7f22-45ea-994f-26a4f953e56b-operator-scripts\") pod \"openstack-galera-2\" (UID: \"31b86704-7f22-45ea-994f-26a4f953e56b\") " pod="glance-kuttl-tests/openstack-galera-2"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.367545 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/66535e59-1359-4bb1-bc04-7cf81d7fedc6-kolla-config\") pod \"openstack-galera-1\" (UID: \"66535e59-1359-4bb1-bc04-7cf81d7fedc6\") " pod="glance-kuttl-tests/openstack-galera-1"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.367603 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-2\" (UID: \"31b86704-7f22-45ea-994f-26a4f953e56b\") " pod="glance-kuttl-tests/openstack-galera-2"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.367637 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66535e59-1359-4bb1-bc04-7cf81d7fedc6-operator-scripts\") pod \"openstack-galera-1\" (UID: \"66535e59-1359-4bb1-bc04-7cf81d7fedc6\") " pod="glance-kuttl-tests/openstack-galera-1"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.367666 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-1\" (UID: \"66535e59-1359-4bb1-bc04-7cf81d7fedc6\") " pod="glance-kuttl-tests/openstack-galera-1"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.367805 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-1\" (UID: \"66535e59-1359-4bb1-bc04-7cf81d7fedc6\") device mount path \"/mnt/openstack/pv04\"" pod="glance-kuttl-tests/openstack-galera-1"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.368058 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/31b86704-7f22-45ea-994f-26a4f953e56b-config-data-generated\") pod \"openstack-galera-2\" (UID: \"31b86704-7f22-45ea-994f-26a4f953e56b\") " pod="glance-kuttl-tests/openstack-galera-2"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.368424 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-2\" (UID: \"31b86704-7f22-45ea-994f-26a4f953e56b\") device mount path \"/mnt/openstack/pv08\"" pod="glance-kuttl-tests/openstack-galera-2"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.368856 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/66535e59-1359-4bb1-bc04-7cf81d7fedc6-kolla-config\") pod \"openstack-galera-1\" (UID: \"66535e59-1359-4bb1-bc04-7cf81d7fedc6\") " pod="glance-kuttl-tests/openstack-galera-1"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.369724 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31b86704-7f22-45ea-994f-26a4f953e56b-operator-scripts\") pod \"openstack-galera-2\" (UID: \"31b86704-7f22-45ea-994f-26a4f953e56b\") " pod="glance-kuttl-tests/openstack-galera-2"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.370454 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66535e59-1359-4bb1-bc04-7cf81d7fedc6-operator-scripts\") pod \"openstack-galera-1\" (UID: \"66535e59-1359-4bb1-bc04-7cf81d7fedc6\") " pod="glance-kuttl-tests/openstack-galera-1"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.383373 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkqfb\" (UniqueName: \"kubernetes.io/projected/31b86704-7f22-45ea-994f-26a4f953e56b-kube-api-access-wkqfb\") pod \"openstack-galera-2\" (UID: \"31b86704-7f22-45ea-994f-26a4f953e56b\") " pod="glance-kuttl-tests/openstack-galera-2"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.386135 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-1\" (UID: \"66535e59-1359-4bb1-bc04-7cf81d7fedc6\") " pod="glance-kuttl-tests/openstack-galera-1"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.396491 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq4p8\" (UniqueName: \"kubernetes.io/projected/66535e59-1359-4bb1-bc04-7cf81d7fedc6-kube-api-access-gq4p8\") pod \"openstack-galera-1\" (UID: \"66535e59-1359-4bb1-bc04-7cf81d7fedc6\") " pod="glance-kuttl-tests/openstack-galera-1"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.418289 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-2\" (UID: \"31b86704-7f22-45ea-994f-26a4f953e56b\") " pod="glance-kuttl-tests/openstack-galera-2"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.686426 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-1"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.697972 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-2"
Dec 03 12:41:27 crc kubenswrapper[4711]: I1203 12:41:27.820115 4711 scope.go:117] "RemoveContainer" containerID="8aceb8eade31346b968b79b9eafdd0ba0740fbe4943a8f7cf724864aa6dd593f"
Dec 03 12:41:27 crc kubenswrapper[4711]: E1203 12:41:27.820419 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e"
Dec 03 12:41:28 crc kubenswrapper[4711]: I1203 12:41:28.399434 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"]
Dec 03 12:41:28 crc kubenswrapper[4711]: I1203 12:41:28.584542 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"]
Dec 03 12:41:28 crc kubenswrapper[4711]: I1203 12:41:28.645271 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"]
Dec 03 12:41:28 crc kubenswrapper[4711]: W1203 12:41:28.647711 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9afca0df_f7de_4a6f_88bf_49378019f63d.slice/crio-0adc0d8398663d95bc3620dd7b85ca46f83d83f39974dd7b4a2063905b57075c WatchSource:0}: Error finding container 0adc0d8398663d95bc3620dd7b85ca46f83d83f39974dd7b4a2063905b57075c: Status 404 returned error can't find the container with id 0adc0d8398663d95bc3620dd7b85ca46f83d83f39974dd7b4a2063905b57075c
Dec 03 12:41:28 crc kubenswrapper[4711]: I1203 12:41:28.669040 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"9afca0df-f7de-4a6f-88bf-49378019f63d","Type":"ContainerStarted","Data":"0adc0d8398663d95bc3620dd7b85ca46f83d83f39974dd7b4a2063905b57075c"}
Dec 03 12:41:28 crc kubenswrapper[4711]: I1203 12:41:28.670900 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-56b7d7797c-sljsp" event={"ID":"07432873-5f89-4207-bc9f-d93994d12733","Type":"ContainerStarted","Data":"406a8600cdd94eb7b72809f6fab0d9db900286a784e8b37eb2121643630a75f9"}
Dec 03 12:41:28 crc kubenswrapper[4711]: I1203 12:41:28.671829 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"66535e59-1359-4bb1-bc04-7cf81d7fedc6","Type":"ContainerStarted","Data":"858dbb932993f3cf91c54fdf33802d078238801ec0f457689cd6b7e21f51b2a0"}
Dec 03 12:41:28 crc kubenswrapper[4711]: I1203 12:41:28.672594 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"31b86704-7f22-45ea-994f-26a4f953e56b","Type":"ContainerStarted","Data":"07845bbbf15c94a21ddf5552293bccfa75641a61017223ddb5332accb29ca4c2"}
Dec 03 12:41:36 crc kubenswrapper[4711]: I1203 12:41:36.734770 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"9afca0df-f7de-4a6f-88bf-49378019f63d","Type":"ContainerStarted","Data":"15a25800e52a13b31a0230e59caf886133d27193a4928dd9da4971e604caf3ae"}
Dec 03 12:41:36 crc kubenswrapper[4711]: I1203 12:41:36.737075 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-56b7d7797c-sljsp" event={"ID":"07432873-5f89-4207-bc9f-d93994d12733","Type":"ContainerStarted","Data":"a771aab5fbed5ea28555412e182a99fd2379fe4ae59df643dbdac25afb3ad48b"}
Dec 03 12:41:36 crc kubenswrapper[4711]: I1203 12:41:36.737186 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-56b7d7797c-sljsp"
Dec 03 12:41:36 crc kubenswrapper[4711]: I1203 12:41:36.738824 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"66535e59-1359-4bb1-bc04-7cf81d7fedc6","Type":"ContainerStarted","Data":"9573ac292fff6dae9630d1c3f2627cc6c1fb69d86d6a435a1adfffec599b9f55"}
Dec 03 12:41:36 crc kubenswrapper[4711]: I1203 12:41:36.740149 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"31b86704-7f22-45ea-994f-26a4f953e56b","Type":"ContainerStarted","Data":"53bc348cc900b49888632a2d3484e8a624996d52eaa8398e04a5f0e82aa62cc6"}
Dec 03 12:41:36 crc kubenswrapper[4711]: I1203 12:41:36.742255 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-56b7d7797c-sljsp"
Dec 03 12:41:36 crc kubenswrapper[4711]: I1203 12:41:36.864284 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-56b7d7797c-sljsp" podStartSLOduration=1.9719609519999999 podStartE2EDuration="11.864261963s" podCreationTimestamp="2025-12-03 12:41:25 +0000 UTC" firstStartedPulling="2025-12-03 12:41:26.347962376 +0000 UTC m=+1605.017213631" lastFinishedPulling="2025-12-03 12:41:36.240263387 +0000 UTC m=+1614.909514642" observedRunningTime="2025-12-03 12:41:36.844247051 +0000 UTC m=+1615.513498326" watchObservedRunningTime="2025-12-03 12:41:36.864261963 +0000 UTC m=+1615.533513218"
Dec 03 12:41:40 crc kubenswrapper[4711]: I1203 12:41:40.765932 4711 generic.go:334] "Generic (PLEG): container finished" podID="66535e59-1359-4bb1-bc04-7cf81d7fedc6" containerID="9573ac292fff6dae9630d1c3f2627cc6c1fb69d86d6a435a1adfffec599b9f55" exitCode=0
Dec 03 12:41:40 crc kubenswrapper[4711]: I1203 12:41:40.766063 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1"
event={"ID":"66535e59-1359-4bb1-bc04-7cf81d7fedc6","Type":"ContainerDied","Data":"9573ac292fff6dae9630d1c3f2627cc6c1fb69d86d6a435a1adfffec599b9f55"} Dec 03 12:41:40 crc kubenswrapper[4711]: I1203 12:41:40.767884 4711 generic.go:334] "Generic (PLEG): container finished" podID="31b86704-7f22-45ea-994f-26a4f953e56b" containerID="53bc348cc900b49888632a2d3484e8a624996d52eaa8398e04a5f0e82aa62cc6" exitCode=0 Dec 03 12:41:40 crc kubenswrapper[4711]: I1203 12:41:40.767959 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"31b86704-7f22-45ea-994f-26a4f953e56b","Type":"ContainerDied","Data":"53bc348cc900b49888632a2d3484e8a624996d52eaa8398e04a5f0e82aa62cc6"} Dec 03 12:41:40 crc kubenswrapper[4711]: I1203 12:41:40.771095 4711 generic.go:334] "Generic (PLEG): container finished" podID="9afca0df-f7de-4a6f-88bf-49378019f63d" containerID="15a25800e52a13b31a0230e59caf886133d27193a4928dd9da4971e604caf3ae" exitCode=0 Dec 03 12:41:40 crc kubenswrapper[4711]: I1203 12:41:40.771158 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"9afca0df-f7de-4a6f-88bf-49378019f63d","Type":"ContainerDied","Data":"15a25800e52a13b31a0230e59caf886133d27193a4928dd9da4971e604caf3ae"} Dec 03 12:41:40 crc kubenswrapper[4711]: I1203 12:41:40.817404 4711 scope.go:117] "RemoveContainer" containerID="8aceb8eade31346b968b79b9eafdd0ba0740fbe4943a8f7cf724864aa6dd593f" Dec 03 12:41:40 crc kubenswrapper[4711]: E1203 12:41:40.817605 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:41:40 crc kubenswrapper[4711]: 
I1203 12:41:40.918336 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/memcached-0"] Dec 03 12:41:40 crc kubenswrapper[4711]: I1203 12:41:40.919235 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/memcached-0" Dec 03 12:41:40 crc kubenswrapper[4711]: I1203 12:41:40.921028 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"memcached-memcached-dockercfg-224jg" Dec 03 12:41:40 crc kubenswrapper[4711]: I1203 12:41:40.921043 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"memcached-config-data" Dec 03 12:41:40 crc kubenswrapper[4711]: I1203 12:41:40.937685 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/memcached-0"] Dec 03 12:41:41 crc kubenswrapper[4711]: I1203 12:41:41.028462 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8p6l\" (UniqueName: \"kubernetes.io/projected/8e6f76a1-658a-47a6-a675-0130bb6cc7a8-kube-api-access-v8p6l\") pod \"memcached-0\" (UID: \"8e6f76a1-658a-47a6-a675-0130bb6cc7a8\") " pod="glance-kuttl-tests/memcached-0" Dec 03 12:41:41 crc kubenswrapper[4711]: I1203 12:41:41.028527 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8e6f76a1-658a-47a6-a675-0130bb6cc7a8-kolla-config\") pod \"memcached-0\" (UID: \"8e6f76a1-658a-47a6-a675-0130bb6cc7a8\") " pod="glance-kuttl-tests/memcached-0" Dec 03 12:41:41 crc kubenswrapper[4711]: I1203 12:41:41.028553 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8e6f76a1-658a-47a6-a675-0130bb6cc7a8-config-data\") pod \"memcached-0\" (UID: \"8e6f76a1-658a-47a6-a675-0130bb6cc7a8\") " pod="glance-kuttl-tests/memcached-0" Dec 03 12:41:41 crc 
kubenswrapper[4711]: I1203 12:41:41.131107 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8p6l\" (UniqueName: \"kubernetes.io/projected/8e6f76a1-658a-47a6-a675-0130bb6cc7a8-kube-api-access-v8p6l\") pod \"memcached-0\" (UID: \"8e6f76a1-658a-47a6-a675-0130bb6cc7a8\") " pod="glance-kuttl-tests/memcached-0" Dec 03 12:41:41 crc kubenswrapper[4711]: I1203 12:41:41.131447 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8e6f76a1-658a-47a6-a675-0130bb6cc7a8-kolla-config\") pod \"memcached-0\" (UID: \"8e6f76a1-658a-47a6-a675-0130bb6cc7a8\") " pod="glance-kuttl-tests/memcached-0" Dec 03 12:41:41 crc kubenswrapper[4711]: I1203 12:41:41.131486 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8e6f76a1-658a-47a6-a675-0130bb6cc7a8-config-data\") pod \"memcached-0\" (UID: \"8e6f76a1-658a-47a6-a675-0130bb6cc7a8\") " pod="glance-kuttl-tests/memcached-0" Dec 03 12:41:41 crc kubenswrapper[4711]: I1203 12:41:41.132220 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8e6f76a1-658a-47a6-a675-0130bb6cc7a8-kolla-config\") pod \"memcached-0\" (UID: \"8e6f76a1-658a-47a6-a675-0130bb6cc7a8\") " pod="glance-kuttl-tests/memcached-0" Dec 03 12:41:41 crc kubenswrapper[4711]: I1203 12:41:41.132329 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8e6f76a1-658a-47a6-a675-0130bb6cc7a8-config-data\") pod \"memcached-0\" (UID: \"8e6f76a1-658a-47a6-a675-0130bb6cc7a8\") " pod="glance-kuttl-tests/memcached-0" Dec 03 12:41:41 crc kubenswrapper[4711]: I1203 12:41:41.152891 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8p6l\" (UniqueName: 
\"kubernetes.io/projected/8e6f76a1-658a-47a6-a675-0130bb6cc7a8-kube-api-access-v8p6l\") pod \"memcached-0\" (UID: \"8e6f76a1-658a-47a6-a675-0130bb6cc7a8\") " pod="glance-kuttl-tests/memcached-0" Dec 03 12:41:41 crc kubenswrapper[4711]: I1203 12:41:41.264013 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/memcached-0" Dec 03 12:41:41 crc kubenswrapper[4711]: I1203 12:41:41.477711 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/memcached-0"] Dec 03 12:41:41 crc kubenswrapper[4711]: W1203 12:41:41.485400 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e6f76a1_658a_47a6_a675_0130bb6cc7a8.slice/crio-f19b85d3368be4845209e65678dd90cd32c9e8085eba6d61777f22c0f18109e5 WatchSource:0}: Error finding container f19b85d3368be4845209e65678dd90cd32c9e8085eba6d61777f22c0f18109e5: Status 404 returned error can't find the container with id f19b85d3368be4845209e65678dd90cd32c9e8085eba6d61777f22c0f18109e5 Dec 03 12:41:41 crc kubenswrapper[4711]: I1203 12:41:41.799651 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"31b86704-7f22-45ea-994f-26a4f953e56b","Type":"ContainerStarted","Data":"267a8ddca536bee4cf1f9f657f2291f19c9d6fba3449d13976366bcdf16106b6"} Dec 03 12:41:41 crc kubenswrapper[4711]: I1203 12:41:41.802290 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"9afca0df-f7de-4a6f-88bf-49378019f63d","Type":"ContainerStarted","Data":"09052d501de470d82e5431a3d1649b9c006f5d6f157b7095f40cd88480a2d69c"} Dec 03 12:41:41 crc kubenswrapper[4711]: I1203 12:41:41.803146 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/memcached-0" 
event={"ID":"8e6f76a1-658a-47a6-a675-0130bb6cc7a8","Type":"ContainerStarted","Data":"f19b85d3368be4845209e65678dd90cd32c9e8085eba6d61777f22c0f18109e5"} Dec 03 12:41:41 crc kubenswrapper[4711]: I1203 12:41:41.808996 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"66535e59-1359-4bb1-bc04-7cf81d7fedc6","Type":"ContainerStarted","Data":"81f4012a77a3fd9c6d4fc595b50de8fe8a5f7f26ae110fe86a3e36b18c4865b9"} Dec 03 12:41:41 crc kubenswrapper[4711]: I1203 12:41:41.838082 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-2" podStartSLOduration=8.068903737 podStartE2EDuration="15.838065547s" podCreationTimestamp="2025-12-03 12:41:26 +0000 UTC" firstStartedPulling="2025-12-03 12:41:28.612005604 +0000 UTC m=+1607.281256859" lastFinishedPulling="2025-12-03 12:41:36.381167404 +0000 UTC m=+1615.050418669" observedRunningTime="2025-12-03 12:41:41.836038791 +0000 UTC m=+1620.505290056" watchObservedRunningTime="2025-12-03 12:41:41.838065547 +0000 UTC m=+1620.507316802" Dec 03 12:41:41 crc kubenswrapper[4711]: I1203 12:41:41.887015 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-1" podStartSLOduration=7.951819436 podStartE2EDuration="15.886992017s" podCreationTimestamp="2025-12-03 12:41:26 +0000 UTC" firstStartedPulling="2025-12-03 12:41:28.411691917 +0000 UTC m=+1607.080943172" lastFinishedPulling="2025-12-03 12:41:36.346864478 +0000 UTC m=+1615.016115753" observedRunningTime="2025-12-03 12:41:41.884786427 +0000 UTC m=+1620.554037712" watchObservedRunningTime="2025-12-03 12:41:41.886992017 +0000 UTC m=+1620.556243272" Dec 03 12:41:41 crc kubenswrapper[4711]: I1203 12:41:41.925276 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-0" podStartSLOduration=8.223954745 podStartE2EDuration="15.925256553s" podCreationTimestamp="2025-12-03 
12:41:26 +0000 UTC" firstStartedPulling="2025-12-03 12:41:28.653743506 +0000 UTC m=+1607.322994761" lastFinishedPulling="2025-12-03 12:41:36.355045314 +0000 UTC m=+1615.024296569" observedRunningTime="2025-12-03 12:41:41.922216999 +0000 UTC m=+1620.591468264" watchObservedRunningTime="2025-12-03 12:41:41.925256553 +0000 UTC m=+1620.594507818" Dec 03 12:41:43 crc kubenswrapper[4711]: I1203 12:41:43.786734 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-x9l2v"] Dec 03 12:41:43 crc kubenswrapper[4711]: I1203 12:41:43.787703 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-x9l2v" Dec 03 12:41:43 crc kubenswrapper[4711]: I1203 12:41:43.790585 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-jw8lh" Dec 03 12:41:43 crc kubenswrapper[4711]: I1203 12:41:43.831029 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-x9l2v"] Dec 03 12:41:43 crc kubenswrapper[4711]: I1203 12:41:43.894667 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9gnh\" (UniqueName: \"kubernetes.io/projected/c3f55cbd-074a-4b43-a557-d6c5ce1ada0d-kube-api-access-t9gnh\") pod \"rabbitmq-cluster-operator-index-x9l2v\" (UID: \"c3f55cbd-074a-4b43-a557-d6c5ce1ada0d\") " pod="openstack-operators/rabbitmq-cluster-operator-index-x9l2v" Dec 03 12:41:43 crc kubenswrapper[4711]: I1203 12:41:43.995269 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9gnh\" (UniqueName: \"kubernetes.io/projected/c3f55cbd-074a-4b43-a557-d6c5ce1ada0d-kube-api-access-t9gnh\") pod \"rabbitmq-cluster-operator-index-x9l2v\" (UID: \"c3f55cbd-074a-4b43-a557-d6c5ce1ada0d\") " pod="openstack-operators/rabbitmq-cluster-operator-index-x9l2v" Dec 
03 12:41:44 crc kubenswrapper[4711]: I1203 12:41:44.026838 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9gnh\" (UniqueName: \"kubernetes.io/projected/c3f55cbd-074a-4b43-a557-d6c5ce1ada0d-kube-api-access-t9gnh\") pod \"rabbitmq-cluster-operator-index-x9l2v\" (UID: \"c3f55cbd-074a-4b43-a557-d6c5ce1ada0d\") " pod="openstack-operators/rabbitmq-cluster-operator-index-x9l2v" Dec 03 12:41:44 crc kubenswrapper[4711]: I1203 12:41:44.108005 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-x9l2v" Dec 03 12:41:44 crc kubenswrapper[4711]: I1203 12:41:44.357971 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-x9l2v"] Dec 03 12:41:44 crc kubenswrapper[4711]: W1203 12:41:44.375178 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3f55cbd_074a_4b43_a557_d6c5ce1ada0d.slice/crio-b28d492c43e18f41a1d20b651998fe0ccc4575e89103f72aec0d8e8ecf974d1e WatchSource:0}: Error finding container b28d492c43e18f41a1d20b651998fe0ccc4575e89103f72aec0d8e8ecf974d1e: Status 404 returned error can't find the container with id b28d492c43e18f41a1d20b651998fe0ccc4575e89103f72aec0d8e8ecf974d1e Dec 03 12:41:44 crc kubenswrapper[4711]: I1203 12:41:44.894239 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-x9l2v" event={"ID":"c3f55cbd-074a-4b43-a557-d6c5ce1ada0d","Type":"ContainerStarted","Data":"b28d492c43e18f41a1d20b651998fe0ccc4575e89103f72aec0d8e8ecf974d1e"} Dec 03 12:41:47 crc kubenswrapper[4711]: I1203 12:41:47.351835 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-0" Dec 03 12:41:47 crc kubenswrapper[4711]: I1203 12:41:47.352219 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="glance-kuttl-tests/openstack-galera-0" Dec 03 12:41:47 crc kubenswrapper[4711]: I1203 12:41:47.687371 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-1" Dec 03 12:41:47 crc kubenswrapper[4711]: I1203 12:41:47.687440 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-1" Dec 03 12:41:47 crc kubenswrapper[4711]: I1203 12:41:47.699134 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-2" Dec 03 12:41:47 crc kubenswrapper[4711]: I1203 12:41:47.699190 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-2" Dec 03 12:41:52 crc kubenswrapper[4711]: I1203 12:41:52.970564 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/memcached-0" event={"ID":"8e6f76a1-658a-47a6-a675-0130bb6cc7a8","Type":"ContainerStarted","Data":"3c90cffccdcea62d7021a47c3f73d0b811ed9370086fdb86dbf7f7552dc8e017"} Dec 03 12:41:52 crc kubenswrapper[4711]: I1203 12:41:52.971240 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/memcached-0" Dec 03 12:41:52 crc kubenswrapper[4711]: I1203 12:41:52.993974 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/memcached-0" podStartSLOduration=2.681263512 podStartE2EDuration="12.993947471s" podCreationTimestamp="2025-12-03 12:41:40 +0000 UTC" firstStartedPulling="2025-12-03 12:41:41.48803721 +0000 UTC m=+1620.157288465" lastFinishedPulling="2025-12-03 12:41:51.800721149 +0000 UTC m=+1630.469972424" observedRunningTime="2025-12-03 12:41:52.988725837 +0000 UTC m=+1631.657977142" watchObservedRunningTime="2025-12-03 12:41:52.993947471 +0000 UTC m=+1631.663198726" Dec 03 12:41:53 crc kubenswrapper[4711]: I1203 12:41:53.796342 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="glance-kuttl-tests/openstack-galera-2" Dec 03 12:41:53 crc kubenswrapper[4711]: I1203 12:41:53.817471 4711 scope.go:117] "RemoveContainer" containerID="8aceb8eade31346b968b79b9eafdd0ba0740fbe4943a8f7cf724864aa6dd593f" Dec 03 12:41:53 crc kubenswrapper[4711]: E1203 12:41:53.817755 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:41:53 crc kubenswrapper[4711]: I1203 12:41:53.871932 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-2" Dec 03 12:41:54 crc kubenswrapper[4711]: E1203 12:41:54.016826 4711 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.38:59038->38.102.83.38:38069: read tcp 38.102.83.38:59038->38.102.83.38:38069: read: connection reset by peer Dec 03 12:41:54 crc kubenswrapper[4711]: E1203 12:41:54.016857 4711 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.38:59038->38.102.83.38:38069: write tcp 38.102.83.38:59038->38.102.83.38:38069: write: broken pipe Dec 03 12:41:54 crc kubenswrapper[4711]: I1203 12:41:54.984447 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-x9l2v" event={"ID":"c3f55cbd-074a-4b43-a557-d6c5ce1ada0d","Type":"ContainerStarted","Data":"221c31b20cb9b5d878d5cce9cdfde331c67e3b95a86dffcd490e6955363fb2e3"} Dec 03 12:41:55 crc kubenswrapper[4711]: I1203 12:41:55.009030 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-x9l2v" podStartSLOduration=2.282176654 
podStartE2EDuration="12.008989929s" podCreationTimestamp="2025-12-03 12:41:43 +0000 UTC" firstStartedPulling="2025-12-03 12:41:44.383262782 +0000 UTC m=+1623.052514037" lastFinishedPulling="2025-12-03 12:41:54.110076057 +0000 UTC m=+1632.779327312" observedRunningTime="2025-12-03 12:41:55.00252263 +0000 UTC m=+1633.671773895" watchObservedRunningTime="2025-12-03 12:41:55.008989929 +0000 UTC m=+1633.678241194" Dec 03 12:41:56 crc kubenswrapper[4711]: I1203 12:41:56.268126 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/memcached-0" Dec 03 12:41:57 crc kubenswrapper[4711]: I1203 12:41:57.766984 4711 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/openstack-galera-2" podUID="31b86704-7f22-45ea-994f-26a4f953e56b" containerName="galera" probeResult="failure" output=< Dec 03 12:41:57 crc kubenswrapper[4711]: wsrep_local_state_comment (Donor/Desynced) differs from Synced Dec 03 12:41:57 crc kubenswrapper[4711]: > Dec 03 12:42:04 crc kubenswrapper[4711]: I1203 12:42:04.108546 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/rabbitmq-cluster-operator-index-x9l2v" Dec 03 12:42:04 crc kubenswrapper[4711]: I1203 12:42:04.109037 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/rabbitmq-cluster-operator-index-x9l2v" Dec 03 12:42:04 crc kubenswrapper[4711]: I1203 12:42:04.132129 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/rabbitmq-cluster-operator-index-x9l2v" Dec 03 12:42:05 crc kubenswrapper[4711]: I1203 12:42:05.081561 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/rabbitmq-cluster-operator-index-x9l2v" Dec 03 12:42:06 crc kubenswrapper[4711]: I1203 12:42:06.852380 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-0" Dec 03 12:42:06 crc 
kubenswrapper[4711]: I1203 12:42:06.929678 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-0" Dec 03 12:42:07 crc kubenswrapper[4711]: I1203 12:42:07.817407 4711 scope.go:117] "RemoveContainer" containerID="8aceb8eade31346b968b79b9eafdd0ba0740fbe4943a8f7cf724864aa6dd593f" Dec 03 12:42:07 crc kubenswrapper[4711]: E1203 12:42:07.817982 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:42:10 crc kubenswrapper[4711]: I1203 12:42:10.830773 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-1" Dec 03 12:42:10 crc kubenswrapper[4711]: I1203 12:42:10.896791 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-1" Dec 03 12:42:13 crc kubenswrapper[4711]: I1203 12:42:13.428224 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2"] Dec 03 12:42:13 crc kubenswrapper[4711]: I1203 12:42:13.445600 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2" Dec 03 12:42:13 crc kubenswrapper[4711]: I1203 12:42:13.449827 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-gw8hz" Dec 03 12:42:13 crc kubenswrapper[4711]: I1203 12:42:13.456272 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2"] Dec 03 12:42:13 crc kubenswrapper[4711]: I1203 12:42:13.546286 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4ddc\" (UniqueName: \"kubernetes.io/projected/84c113ac-ae1d-427b-892c-64fca086fb54-kube-api-access-v4ddc\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2\" (UID: \"84c113ac-ae1d-427b-892c-64fca086fb54\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2" Dec 03 12:42:13 crc kubenswrapper[4711]: I1203 12:42:13.546375 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84c113ac-ae1d-427b-892c-64fca086fb54-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2\" (UID: \"84c113ac-ae1d-427b-892c-64fca086fb54\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2" Dec 03 12:42:13 crc kubenswrapper[4711]: I1203 12:42:13.546654 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/84c113ac-ae1d-427b-892c-64fca086fb54-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2\" (UID: \"84c113ac-ae1d-427b-892c-64fca086fb54\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2" Dec 03 12:42:13 crc kubenswrapper[4711]: I1203 
12:42:13.647650 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/84c113ac-ae1d-427b-892c-64fca086fb54-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2\" (UID: \"84c113ac-ae1d-427b-892c-64fca086fb54\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2" Dec 03 12:42:13 crc kubenswrapper[4711]: I1203 12:42:13.647699 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4ddc\" (UniqueName: \"kubernetes.io/projected/84c113ac-ae1d-427b-892c-64fca086fb54-kube-api-access-v4ddc\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2\" (UID: \"84c113ac-ae1d-427b-892c-64fca086fb54\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2" Dec 03 12:42:13 crc kubenswrapper[4711]: I1203 12:42:13.647737 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84c113ac-ae1d-427b-892c-64fca086fb54-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2\" (UID: \"84c113ac-ae1d-427b-892c-64fca086fb54\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2" Dec 03 12:42:13 crc kubenswrapper[4711]: I1203 12:42:13.648193 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84c113ac-ae1d-427b-892c-64fca086fb54-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2\" (UID: \"84c113ac-ae1d-427b-892c-64fca086fb54\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2" Dec 03 12:42:13 crc kubenswrapper[4711]: I1203 12:42:13.648403 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/84c113ac-ae1d-427b-892c-64fca086fb54-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2\" (UID: \"84c113ac-ae1d-427b-892c-64fca086fb54\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2" Dec 03 12:42:13 crc kubenswrapper[4711]: I1203 12:42:13.671256 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4ddc\" (UniqueName: \"kubernetes.io/projected/84c113ac-ae1d-427b-892c-64fca086fb54-kube-api-access-v4ddc\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2\" (UID: \"84c113ac-ae1d-427b-892c-64fca086fb54\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2" Dec 03 12:42:13 crc kubenswrapper[4711]: I1203 12:42:13.780051 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2" Dec 03 12:42:13 crc kubenswrapper[4711]: I1203 12:42:13.981215 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2"] Dec 03 12:42:14 crc kubenswrapper[4711]: I1203 12:42:14.113593 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2" event={"ID":"84c113ac-ae1d-427b-892c-64fca086fb54","Type":"ContainerStarted","Data":"b4218e4348a5ccce56b4a3f7271c09caebfa021a5b93f38b092ed74698215e65"} Dec 03 12:42:15 crc kubenswrapper[4711]: I1203 12:42:15.124333 4711 generic.go:334] "Generic (PLEG): container finished" podID="84c113ac-ae1d-427b-892c-64fca086fb54" containerID="c189990c58becafa8d3375f1cf47bb79d0f55e016ce5fb35ea8142a07451e702" exitCode=0 Dec 03 12:42:15 crc kubenswrapper[4711]: I1203 12:42:15.124411 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2" event={"ID":"84c113ac-ae1d-427b-892c-64fca086fb54","Type":"ContainerDied","Data":"c189990c58becafa8d3375f1cf47bb79d0f55e016ce5fb35ea8142a07451e702"} Dec 03 12:42:17 crc kubenswrapper[4711]: I1203 12:42:17.139765 4711 generic.go:334] "Generic (PLEG): container finished" podID="84c113ac-ae1d-427b-892c-64fca086fb54" containerID="a66b4edb990d3c658a7245cadc14817538b9c6e139951609edbc570575870205" exitCode=0 Dec 03 12:42:17 crc kubenswrapper[4711]: I1203 12:42:17.139859 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2" event={"ID":"84c113ac-ae1d-427b-892c-64fca086fb54","Type":"ContainerDied","Data":"a66b4edb990d3c658a7245cadc14817538b9c6e139951609edbc570575870205"} Dec 03 12:42:18 crc kubenswrapper[4711]: I1203 12:42:18.161042 4711 generic.go:334] "Generic (PLEG): container finished" podID="84c113ac-ae1d-427b-892c-64fca086fb54" containerID="28d692a82f1541be0629bd9fa1d19ba4d8b66a26ed69e34b8f7b4bd49386e4ee" exitCode=0 Dec 03 12:42:18 crc kubenswrapper[4711]: I1203 12:42:18.161173 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2" event={"ID":"84c113ac-ae1d-427b-892c-64fca086fb54","Type":"ContainerDied","Data":"28d692a82f1541be0629bd9fa1d19ba4d8b66a26ed69e34b8f7b4bd49386e4ee"} Dec 03 12:42:19 crc kubenswrapper[4711]: I1203 12:42:19.442637 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2" Dec 03 12:42:19 crc kubenswrapper[4711]: I1203 12:42:19.456104 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4ddc\" (UniqueName: \"kubernetes.io/projected/84c113ac-ae1d-427b-892c-64fca086fb54-kube-api-access-v4ddc\") pod \"84c113ac-ae1d-427b-892c-64fca086fb54\" (UID: \"84c113ac-ae1d-427b-892c-64fca086fb54\") " Dec 03 12:42:19 crc kubenswrapper[4711]: I1203 12:42:19.456157 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/84c113ac-ae1d-427b-892c-64fca086fb54-util\") pod \"84c113ac-ae1d-427b-892c-64fca086fb54\" (UID: \"84c113ac-ae1d-427b-892c-64fca086fb54\") " Dec 03 12:42:19 crc kubenswrapper[4711]: I1203 12:42:19.456227 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84c113ac-ae1d-427b-892c-64fca086fb54-bundle\") pod \"84c113ac-ae1d-427b-892c-64fca086fb54\" (UID: \"84c113ac-ae1d-427b-892c-64fca086fb54\") " Dec 03 12:42:19 crc kubenswrapper[4711]: I1203 12:42:19.460245 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84c113ac-ae1d-427b-892c-64fca086fb54-bundle" (OuterVolumeSpecName: "bundle") pod "84c113ac-ae1d-427b-892c-64fca086fb54" (UID: "84c113ac-ae1d-427b-892c-64fca086fb54"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:42:19 crc kubenswrapper[4711]: I1203 12:42:19.463527 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84c113ac-ae1d-427b-892c-64fca086fb54-kube-api-access-v4ddc" (OuterVolumeSpecName: "kube-api-access-v4ddc") pod "84c113ac-ae1d-427b-892c-64fca086fb54" (UID: "84c113ac-ae1d-427b-892c-64fca086fb54"). InnerVolumeSpecName "kube-api-access-v4ddc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:42:19 crc kubenswrapper[4711]: I1203 12:42:19.557730 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4ddc\" (UniqueName: \"kubernetes.io/projected/84c113ac-ae1d-427b-892c-64fca086fb54-kube-api-access-v4ddc\") on node \"crc\" DevicePath \"\"" Dec 03 12:42:19 crc kubenswrapper[4711]: I1203 12:42:19.557770 4711 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84c113ac-ae1d-427b-892c-64fca086fb54-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:42:19 crc kubenswrapper[4711]: I1203 12:42:19.861806 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84c113ac-ae1d-427b-892c-64fca086fb54-util" (OuterVolumeSpecName: "util") pod "84c113ac-ae1d-427b-892c-64fca086fb54" (UID: "84c113ac-ae1d-427b-892c-64fca086fb54"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:42:19 crc kubenswrapper[4711]: I1203 12:42:19.961781 4711 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/84c113ac-ae1d-427b-892c-64fca086fb54-util\") on node \"crc\" DevicePath \"\"" Dec 03 12:42:20 crc kubenswrapper[4711]: I1203 12:42:20.176632 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2" event={"ID":"84c113ac-ae1d-427b-892c-64fca086fb54","Type":"ContainerDied","Data":"b4218e4348a5ccce56b4a3f7271c09caebfa021a5b93f38b092ed74698215e65"} Dec 03 12:42:20 crc kubenswrapper[4711]: I1203 12:42:20.176666 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4218e4348a5ccce56b4a3f7271c09caebfa021a5b93f38b092ed74698215e65" Dec 03 12:42:20 crc kubenswrapper[4711]: I1203 12:42:20.176694 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2" Dec 03 12:42:20 crc kubenswrapper[4711]: I1203 12:42:20.816772 4711 scope.go:117] "RemoveContainer" containerID="8aceb8eade31346b968b79b9eafdd0ba0740fbe4943a8f7cf724864aa6dd593f" Dec 03 12:42:20 crc kubenswrapper[4711]: E1203 12:42:20.817353 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:42:25 crc kubenswrapper[4711]: I1203 12:42:25.411092 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-m5pkk"] Dec 03 12:42:25 crc kubenswrapper[4711]: E1203 12:42:25.411735 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84c113ac-ae1d-427b-892c-64fca086fb54" containerName="pull" Dec 03 12:42:25 crc kubenswrapper[4711]: I1203 12:42:25.411753 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="84c113ac-ae1d-427b-892c-64fca086fb54" containerName="pull" Dec 03 12:42:25 crc kubenswrapper[4711]: E1203 12:42:25.411772 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84c113ac-ae1d-427b-892c-64fca086fb54" containerName="util" Dec 03 12:42:25 crc kubenswrapper[4711]: I1203 12:42:25.411780 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="84c113ac-ae1d-427b-892c-64fca086fb54" containerName="util" Dec 03 12:42:25 crc kubenswrapper[4711]: E1203 12:42:25.411798 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84c113ac-ae1d-427b-892c-64fca086fb54" containerName="extract" Dec 03 12:42:25 crc kubenswrapper[4711]: I1203 12:42:25.411807 4711 
state_mem.go:107] "Deleted CPUSet assignment" podUID="84c113ac-ae1d-427b-892c-64fca086fb54" containerName="extract" Dec 03 12:42:25 crc kubenswrapper[4711]: I1203 12:42:25.411945 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="84c113ac-ae1d-427b-892c-64fca086fb54" containerName="extract" Dec 03 12:42:25 crc kubenswrapper[4711]: I1203 12:42:25.412618 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-m5pkk" Dec 03 12:42:25 crc kubenswrapper[4711]: I1203 12:42:25.415528 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-7mgtp" Dec 03 12:42:25 crc kubenswrapper[4711]: I1203 12:42:25.427586 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-m5pkk"] Dec 03 12:42:25 crc kubenswrapper[4711]: I1203 12:42:25.535145 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6xbx\" (UniqueName: \"kubernetes.io/projected/aa0a2065-4d7f-45a4-a13b-f7eb60da44fd-kube-api-access-r6xbx\") pod \"rabbitmq-cluster-operator-779fc9694b-m5pkk\" (UID: \"aa0a2065-4d7f-45a4-a13b-f7eb60da44fd\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-m5pkk" Dec 03 12:42:25 crc kubenswrapper[4711]: I1203 12:42:25.636967 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6xbx\" (UniqueName: \"kubernetes.io/projected/aa0a2065-4d7f-45a4-a13b-f7eb60da44fd-kube-api-access-r6xbx\") pod \"rabbitmq-cluster-operator-779fc9694b-m5pkk\" (UID: \"aa0a2065-4d7f-45a4-a13b-f7eb60da44fd\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-m5pkk" Dec 03 12:42:25 crc kubenswrapper[4711]: I1203 12:42:25.656634 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6xbx\" (UniqueName: 
\"kubernetes.io/projected/aa0a2065-4d7f-45a4-a13b-f7eb60da44fd-kube-api-access-r6xbx\") pod \"rabbitmq-cluster-operator-779fc9694b-m5pkk\" (UID: \"aa0a2065-4d7f-45a4-a13b-f7eb60da44fd\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-m5pkk" Dec 03 12:42:25 crc kubenswrapper[4711]: I1203 12:42:25.735780 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-m5pkk" Dec 03 12:42:26 crc kubenswrapper[4711]: I1203 12:42:26.024529 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-m5pkk"] Dec 03 12:42:26 crc kubenswrapper[4711]: I1203 12:42:26.214050 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-m5pkk" event={"ID":"aa0a2065-4d7f-45a4-a13b-f7eb60da44fd","Type":"ContainerStarted","Data":"97f648d1af90382b7afe472d174783e936a1deb7ac69a5a6b7051ae23f51abd0"} Dec 03 12:42:31 crc kubenswrapper[4711]: I1203 12:42:31.250973 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-m5pkk" event={"ID":"aa0a2065-4d7f-45a4-a13b-f7eb60da44fd","Type":"ContainerStarted","Data":"d5aa272ba5456c37b8089724d567266200f9fc850777f37ee02572379da96d8c"} Dec 03 12:42:31 crc kubenswrapper[4711]: I1203 12:42:31.276406 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-m5pkk" podStartSLOduration=1.796695491 podStartE2EDuration="6.276311899s" podCreationTimestamp="2025-12-03 12:42:25 +0000 UTC" firstStartedPulling="2025-12-03 12:42:26.038893342 +0000 UTC m=+1664.708144597" lastFinishedPulling="2025-12-03 12:42:30.51850975 +0000 UTC m=+1669.187761005" observedRunningTime="2025-12-03 12:42:31.26913423 +0000 UTC m=+1669.938385495" watchObservedRunningTime="2025-12-03 12:42:31.276311899 +0000 UTC m=+1669.945563184" Dec 03 12:42:31 crc 
kubenswrapper[4711]: I1203 12:42:31.822123 4711 scope.go:117] "RemoveContainer" containerID="8aceb8eade31346b968b79b9eafdd0ba0740fbe4943a8f7cf724864aa6dd593f" Dec 03 12:42:31 crc kubenswrapper[4711]: E1203 12:42:31.822754 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.236390 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.237596 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/rabbitmq-server-0" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.239459 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-erlang-cookie" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.239891 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"rabbitmq-plugins-conf" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.239927 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"rabbitmq-server-conf" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.241738 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-server-dockercfg-6qvlf" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.242542 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-default-user" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.259039 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["glance-kuttl-tests/rabbitmq-server-0"] Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.277043 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3211846a-954e-420b-8ed8-97a0d955aa2e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3211846a-954e-420b-8ed8-97a0d955aa2e\") pod \"rabbitmq-server-0\" (UID: \"3b4ec49f-4f73-4f76-9945-01577848b79c\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.277112 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3b4ec49f-4f73-4f76-9945-01577848b79c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3b4ec49f-4f73-4f76-9945-01577848b79c\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.277153 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3b4ec49f-4f73-4f76-9945-01577848b79c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3b4ec49f-4f73-4f76-9945-01577848b79c\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.277174 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llmpg\" (UniqueName: \"kubernetes.io/projected/3b4ec49f-4f73-4f76-9945-01577848b79c-kube-api-access-llmpg\") pod \"rabbitmq-server-0\" (UID: \"3b4ec49f-4f73-4f76-9945-01577848b79c\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.277224 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3b4ec49f-4f73-4f76-9945-01577848b79c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"3b4ec49f-4f73-4f76-9945-01577848b79c\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.277258 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3b4ec49f-4f73-4f76-9945-01577848b79c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3b4ec49f-4f73-4f76-9945-01577848b79c\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.277273 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3b4ec49f-4f73-4f76-9945-01577848b79c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3b4ec49f-4f73-4f76-9945-01577848b79c\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.277304 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3b4ec49f-4f73-4f76-9945-01577848b79c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3b4ec49f-4f73-4f76-9945-01577848b79c\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.379151 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3211846a-954e-420b-8ed8-97a0d955aa2e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3211846a-954e-420b-8ed8-97a0d955aa2e\") pod \"rabbitmq-server-0\" (UID: \"3b4ec49f-4f73-4f76-9945-01577848b79c\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.379212 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3b4ec49f-4f73-4f76-9945-01577848b79c-rabbitmq-confd\") pod 
\"rabbitmq-server-0\" (UID: \"3b4ec49f-4f73-4f76-9945-01577848b79c\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.379269 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3b4ec49f-4f73-4f76-9945-01577848b79c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3b4ec49f-4f73-4f76-9945-01577848b79c\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.379291 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llmpg\" (UniqueName: \"kubernetes.io/projected/3b4ec49f-4f73-4f76-9945-01577848b79c-kube-api-access-llmpg\") pod \"rabbitmq-server-0\" (UID: \"3b4ec49f-4f73-4f76-9945-01577848b79c\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.379328 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3b4ec49f-4f73-4f76-9945-01577848b79c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3b4ec49f-4f73-4f76-9945-01577848b79c\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.379371 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3b4ec49f-4f73-4f76-9945-01577848b79c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3b4ec49f-4f73-4f76-9945-01577848b79c\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.379392 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3b4ec49f-4f73-4f76-9945-01577848b79c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3b4ec49f-4f73-4f76-9945-01577848b79c\") " 
pod="glance-kuttl-tests/rabbitmq-server-0" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.379437 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3b4ec49f-4f73-4f76-9945-01577848b79c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3b4ec49f-4f73-4f76-9945-01577848b79c\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.380042 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3b4ec49f-4f73-4f76-9945-01577848b79c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3b4ec49f-4f73-4f76-9945-01577848b79c\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.381092 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3b4ec49f-4f73-4f76-9945-01577848b79c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3b4ec49f-4f73-4f76-9945-01577848b79c\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.381821 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3b4ec49f-4f73-4f76-9945-01577848b79c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3b4ec49f-4f73-4f76-9945-01577848b79c\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.382367 4711 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.382398 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3211846a-954e-420b-8ed8-97a0d955aa2e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3211846a-954e-420b-8ed8-97a0d955aa2e\") pod \"rabbitmq-server-0\" (UID: \"3b4ec49f-4f73-4f76-9945-01577848b79c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5041d2e266f23df1516ff06983f856e6daf37abc821ddce599a0592e775cfc7c/globalmount\"" pod="glance-kuttl-tests/rabbitmq-server-0" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.386606 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3b4ec49f-4f73-4f76-9945-01577848b79c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3b4ec49f-4f73-4f76-9945-01577848b79c\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.394120 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3b4ec49f-4f73-4f76-9945-01577848b79c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3b4ec49f-4f73-4f76-9945-01577848b79c\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.394431 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3b4ec49f-4f73-4f76-9945-01577848b79c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3b4ec49f-4f73-4f76-9945-01577848b79c\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.402793 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llmpg\" (UniqueName: \"kubernetes.io/projected/3b4ec49f-4f73-4f76-9945-01577848b79c-kube-api-access-llmpg\") pod \"rabbitmq-server-0\" (UID: 
\"3b4ec49f-4f73-4f76-9945-01577848b79c\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.425817 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3211846a-954e-420b-8ed8-97a0d955aa2e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3211846a-954e-420b-8ed8-97a0d955aa2e\") pod \"rabbitmq-server-0\" (UID: \"3b4ec49f-4f73-4f76-9945-01577848b79c\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.553510 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/rabbitmq-server-0" Dec 03 12:42:34 crc kubenswrapper[4711]: I1203 12:42:34.828000 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Dec 03 12:42:35 crc kubenswrapper[4711]: I1203 12:42:35.272847 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"3b4ec49f-4f73-4f76-9945-01577848b79c","Type":"ContainerStarted","Data":"e7a758437c57dcfcdb8a7c55455642962bf06f64e38c03bba0eddaeda2d7d851"} Dec 03 12:42:35 crc kubenswrapper[4711]: I1203 12:42:35.972802 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-8fd8f"] Dec 03 12:42:35 crc kubenswrapper[4711]: I1203 12:42:35.973969 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-8fd8f" Dec 03 12:42:35 crc kubenswrapper[4711]: I1203 12:42:35.985849 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-hcmgz" Dec 03 12:42:35 crc kubenswrapper[4711]: I1203 12:42:35.994098 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-8fd8f"] Dec 03 12:42:36 crc kubenswrapper[4711]: I1203 12:42:36.010094 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-967mx\" (UniqueName: \"kubernetes.io/projected/10ee7190-a223-47a2-9e03-081431e48b74-kube-api-access-967mx\") pod \"keystone-operator-index-8fd8f\" (UID: \"10ee7190-a223-47a2-9e03-081431e48b74\") " pod="openstack-operators/keystone-operator-index-8fd8f" Dec 03 12:42:36 crc kubenswrapper[4711]: I1203 12:42:36.111994 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-967mx\" (UniqueName: \"kubernetes.io/projected/10ee7190-a223-47a2-9e03-081431e48b74-kube-api-access-967mx\") pod \"keystone-operator-index-8fd8f\" (UID: \"10ee7190-a223-47a2-9e03-081431e48b74\") " pod="openstack-operators/keystone-operator-index-8fd8f" Dec 03 12:42:36 crc kubenswrapper[4711]: I1203 12:42:36.132596 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-967mx\" (UniqueName: \"kubernetes.io/projected/10ee7190-a223-47a2-9e03-081431e48b74-kube-api-access-967mx\") pod \"keystone-operator-index-8fd8f\" (UID: \"10ee7190-a223-47a2-9e03-081431e48b74\") " pod="openstack-operators/keystone-operator-index-8fd8f" Dec 03 12:42:36 crc kubenswrapper[4711]: I1203 12:42:36.304073 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-8fd8f" Dec 03 12:42:36 crc kubenswrapper[4711]: I1203 12:42:36.714772 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-8fd8f"] Dec 03 12:42:37 crc kubenswrapper[4711]: W1203 12:42:37.629804 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10ee7190_a223_47a2_9e03_081431e48b74.slice/crio-11bf3725c3872206dede5429a0f1aeea5debefa31ca7040d84f86b7a523fd6ab WatchSource:0}: Error finding container 11bf3725c3872206dede5429a0f1aeea5debefa31ca7040d84f86b7a523fd6ab: Status 404 returned error can't find the container with id 11bf3725c3872206dede5429a0f1aeea5debefa31ca7040d84f86b7a523fd6ab Dec 03 12:42:38 crc kubenswrapper[4711]: I1203 12:42:38.296481 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-8fd8f" event={"ID":"10ee7190-a223-47a2-9e03-081431e48b74","Type":"ContainerStarted","Data":"11bf3725c3872206dede5429a0f1aeea5debefa31ca7040d84f86b7a523fd6ab"} Dec 03 12:42:40 crc kubenswrapper[4711]: I1203 12:42:40.367089 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-8fd8f"] Dec 03 12:42:40 crc kubenswrapper[4711]: I1203 12:42:40.970799 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-26962"] Dec 03 12:42:40 crc kubenswrapper[4711]: I1203 12:42:40.973169 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-26962" Dec 03 12:42:40 crc kubenswrapper[4711]: I1203 12:42:40.980798 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-26962"] Dec 03 12:42:41 crc kubenswrapper[4711]: I1203 12:42:41.010340 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qjz9\" (UniqueName: \"kubernetes.io/projected/6c07e470-357e-4332-a4f2-e713b2a9e485-kube-api-access-9qjz9\") pod \"keystone-operator-index-26962\" (UID: \"6c07e470-357e-4332-a4f2-e713b2a9e485\") " pod="openstack-operators/keystone-operator-index-26962" Dec 03 12:42:41 crc kubenswrapper[4711]: I1203 12:42:41.112159 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qjz9\" (UniqueName: \"kubernetes.io/projected/6c07e470-357e-4332-a4f2-e713b2a9e485-kube-api-access-9qjz9\") pod \"keystone-operator-index-26962\" (UID: \"6c07e470-357e-4332-a4f2-e713b2a9e485\") " pod="openstack-operators/keystone-operator-index-26962" Dec 03 12:42:41 crc kubenswrapper[4711]: I1203 12:42:41.139560 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qjz9\" (UniqueName: \"kubernetes.io/projected/6c07e470-357e-4332-a4f2-e713b2a9e485-kube-api-access-9qjz9\") pod \"keystone-operator-index-26962\" (UID: \"6c07e470-357e-4332-a4f2-e713b2a9e485\") " pod="openstack-operators/keystone-operator-index-26962" Dec 03 12:42:41 crc kubenswrapper[4711]: I1203 12:42:41.319023 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-8fd8f" event={"ID":"10ee7190-a223-47a2-9e03-081431e48b74","Type":"ContainerStarted","Data":"0655227b78c88002452ef4bdb10a23e45a03a4b9485c66aae3e0875bf7976606"} Dec 03 12:42:41 crc kubenswrapper[4711]: I1203 12:42:41.319130 4711 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-operators/keystone-operator-index-8fd8f" podUID="10ee7190-a223-47a2-9e03-081431e48b74" containerName="registry-server" containerID="cri-o://0655227b78c88002452ef4bdb10a23e45a03a4b9485c66aae3e0875bf7976606" gracePeriod=2 Dec 03 12:42:41 crc kubenswrapper[4711]: I1203 12:42:41.335183 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-8fd8f" podStartSLOduration=3.02308276 podStartE2EDuration="6.335161224s" podCreationTimestamp="2025-12-03 12:42:35 +0000 UTC" firstStartedPulling="2025-12-03 12:42:37.633744977 +0000 UTC m=+1676.302996232" lastFinishedPulling="2025-12-03 12:42:40.945823431 +0000 UTC m=+1679.615074696" observedRunningTime="2025-12-03 12:42:41.329877908 +0000 UTC m=+1679.999129173" watchObservedRunningTime="2025-12-03 12:42:41.335161224 +0000 UTC m=+1680.004412479" Dec 03 12:42:41 crc kubenswrapper[4711]: I1203 12:42:41.398564 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-26962" Dec 03 12:42:41 crc kubenswrapper[4711]: I1203 12:42:41.594881 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-26962"] Dec 03 12:42:41 crc kubenswrapper[4711]: I1203 12:42:41.737785 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-8fd8f" Dec 03 12:42:41 crc kubenswrapper[4711]: I1203 12:42:41.823544 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-967mx\" (UniqueName: \"kubernetes.io/projected/10ee7190-a223-47a2-9e03-081431e48b74-kube-api-access-967mx\") pod \"10ee7190-a223-47a2-9e03-081431e48b74\" (UID: \"10ee7190-a223-47a2-9e03-081431e48b74\") " Dec 03 12:42:41 crc kubenswrapper[4711]: I1203 12:42:41.849301 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10ee7190-a223-47a2-9e03-081431e48b74-kube-api-access-967mx" (OuterVolumeSpecName: "kube-api-access-967mx") pod "10ee7190-a223-47a2-9e03-081431e48b74" (UID: "10ee7190-a223-47a2-9e03-081431e48b74"). InnerVolumeSpecName "kube-api-access-967mx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:42:41 crc kubenswrapper[4711]: I1203 12:42:41.925699 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-967mx\" (UniqueName: \"kubernetes.io/projected/10ee7190-a223-47a2-9e03-081431e48b74-kube-api-access-967mx\") on node \"crc\" DevicePath \"\"" Dec 03 12:42:42 crc kubenswrapper[4711]: I1203 12:42:42.327197 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"3b4ec49f-4f73-4f76-9945-01577848b79c","Type":"ContainerStarted","Data":"e697619310d41a4f6e93095ec1836329f8fdc371d8bdb12d2801b13da1e373f0"} Dec 03 12:42:42 crc kubenswrapper[4711]: I1203 12:42:42.329109 4711 generic.go:334] "Generic (PLEG): container finished" podID="10ee7190-a223-47a2-9e03-081431e48b74" containerID="0655227b78c88002452ef4bdb10a23e45a03a4b9485c66aae3e0875bf7976606" exitCode=0 Dec 03 12:42:42 crc kubenswrapper[4711]: I1203 12:42:42.329231 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-8fd8f" 
event={"ID":"10ee7190-a223-47a2-9e03-081431e48b74","Type":"ContainerDied","Data":"0655227b78c88002452ef4bdb10a23e45a03a4b9485c66aae3e0875bf7976606"} Dec 03 12:42:42 crc kubenswrapper[4711]: I1203 12:42:42.329284 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-8fd8f" event={"ID":"10ee7190-a223-47a2-9e03-081431e48b74","Type":"ContainerDied","Data":"11bf3725c3872206dede5429a0f1aeea5debefa31ca7040d84f86b7a523fd6ab"} Dec 03 12:42:42 crc kubenswrapper[4711]: I1203 12:42:42.329302 4711 scope.go:117] "RemoveContainer" containerID="0655227b78c88002452ef4bdb10a23e45a03a4b9485c66aae3e0875bf7976606" Dec 03 12:42:42 crc kubenswrapper[4711]: I1203 12:42:42.329465 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-8fd8f" Dec 03 12:42:42 crc kubenswrapper[4711]: I1203 12:42:42.330979 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-26962" event={"ID":"6c07e470-357e-4332-a4f2-e713b2a9e485","Type":"ContainerStarted","Data":"be0378802e5ed768d0abe80fc8d6b255ce8decfec856c21800fb39eb17a1f297"} Dec 03 12:42:42 crc kubenswrapper[4711]: I1203 12:42:42.331010 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-26962" event={"ID":"6c07e470-357e-4332-a4f2-e713b2a9e485","Type":"ContainerStarted","Data":"ad77b8f9270dbe9ccc04cb22446b4bc1967b95ad9e4951a44e921999367b646e"} Dec 03 12:42:42 crc kubenswrapper[4711]: I1203 12:42:42.361570 4711 scope.go:117] "RemoveContainer" containerID="0655227b78c88002452ef4bdb10a23e45a03a4b9485c66aae3e0875bf7976606" Dec 03 12:42:42 crc kubenswrapper[4711]: E1203 12:42:42.362545 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0655227b78c88002452ef4bdb10a23e45a03a4b9485c66aae3e0875bf7976606\": container with ID starting with 
0655227b78c88002452ef4bdb10a23e45a03a4b9485c66aae3e0875bf7976606 not found: ID does not exist" containerID="0655227b78c88002452ef4bdb10a23e45a03a4b9485c66aae3e0875bf7976606" Dec 03 12:42:42 crc kubenswrapper[4711]: I1203 12:42:42.362596 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0655227b78c88002452ef4bdb10a23e45a03a4b9485c66aae3e0875bf7976606"} err="failed to get container status \"0655227b78c88002452ef4bdb10a23e45a03a4b9485c66aae3e0875bf7976606\": rpc error: code = NotFound desc = could not find container \"0655227b78c88002452ef4bdb10a23e45a03a4b9485c66aae3e0875bf7976606\": container with ID starting with 0655227b78c88002452ef4bdb10a23e45a03a4b9485c66aae3e0875bf7976606 not found: ID does not exist" Dec 03 12:42:42 crc kubenswrapper[4711]: I1203 12:42:42.370435 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-8fd8f"] Dec 03 12:42:42 crc kubenswrapper[4711]: I1203 12:42:42.377742 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-index-8fd8f"] Dec 03 12:42:43 crc kubenswrapper[4711]: I1203 12:42:43.829692 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10ee7190-a223-47a2-9e03-081431e48b74" path="/var/lib/kubelet/pods/10ee7190-a223-47a2-9e03-081431e48b74/volumes" Dec 03 12:42:46 crc kubenswrapper[4711]: I1203 12:42:46.819780 4711 scope.go:117] "RemoveContainer" containerID="8aceb8eade31346b968b79b9eafdd0ba0740fbe4943a8f7cf724864aa6dd593f" Dec 03 12:42:46 crc kubenswrapper[4711]: E1203 12:42:46.820475 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" 
podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:42:51 crc kubenswrapper[4711]: I1203 12:42:51.398673 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-index-26962" Dec 03 12:42:51 crc kubenswrapper[4711]: I1203 12:42:51.399030 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/keystone-operator-index-26962" Dec 03 12:42:51 crc kubenswrapper[4711]: I1203 12:42:51.430028 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/keystone-operator-index-26962" Dec 03 12:42:51 crc kubenswrapper[4711]: I1203 12:42:51.461186 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-26962" podStartSLOduration=11.003737758 podStartE2EDuration="11.461156698s" podCreationTimestamp="2025-12-03 12:42:40 +0000 UTC" firstStartedPulling="2025-12-03 12:42:41.60753466 +0000 UTC m=+1680.276785915" lastFinishedPulling="2025-12-03 12:42:42.0649536 +0000 UTC m=+1680.734204855" observedRunningTime="2025-12-03 12:42:42.380174437 +0000 UTC m=+1681.049425722" watchObservedRunningTime="2025-12-03 12:42:51.461156698 +0000 UTC m=+1690.130407993" Dec 03 12:42:52 crc kubenswrapper[4711]: I1203 12:42:52.432487 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-index-26962" Dec 03 12:42:54 crc kubenswrapper[4711]: I1203 12:42:54.201943 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd"] Dec 03 12:42:54 crc kubenswrapper[4711]: E1203 12:42:54.202228 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10ee7190-a223-47a2-9e03-081431e48b74" containerName="registry-server" Dec 03 12:42:54 crc kubenswrapper[4711]: I1203 12:42:54.202244 4711 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="10ee7190-a223-47a2-9e03-081431e48b74" containerName="registry-server" Dec 03 12:42:54 crc kubenswrapper[4711]: I1203 12:42:54.202371 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="10ee7190-a223-47a2-9e03-081431e48b74" containerName="registry-server" Dec 03 12:42:54 crc kubenswrapper[4711]: I1203 12:42:54.203287 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd" Dec 03 12:42:54 crc kubenswrapper[4711]: I1203 12:42:54.206238 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-gw8hz" Dec 03 12:42:54 crc kubenswrapper[4711]: I1203 12:42:54.211459 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd"] Dec 03 12:42:54 crc kubenswrapper[4711]: I1203 12:42:54.301817 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsm87\" (UniqueName: \"kubernetes.io/projected/6cf0822e-a898-4a5a-9cdd-a10e0a56f468-kube-api-access-lsm87\") pod \"49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd\" (UID: \"6cf0822e-a898-4a5a-9cdd-a10e0a56f468\") " pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd" Dec 03 12:42:54 crc kubenswrapper[4711]: I1203 12:42:54.301879 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6cf0822e-a898-4a5a-9cdd-a10e0a56f468-bundle\") pod \"49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd\" (UID: \"6cf0822e-a898-4a5a-9cdd-a10e0a56f468\") " pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd" Dec 03 12:42:54 crc kubenswrapper[4711]: I1203 12:42:54.302183 4711 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6cf0822e-a898-4a5a-9cdd-a10e0a56f468-util\") pod \"49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd\" (UID: \"6cf0822e-a898-4a5a-9cdd-a10e0a56f468\") " pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd" Dec 03 12:42:54 crc kubenswrapper[4711]: I1203 12:42:54.403894 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsm87\" (UniqueName: \"kubernetes.io/projected/6cf0822e-a898-4a5a-9cdd-a10e0a56f468-kube-api-access-lsm87\") pod \"49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd\" (UID: \"6cf0822e-a898-4a5a-9cdd-a10e0a56f468\") " pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd" Dec 03 12:42:54 crc kubenswrapper[4711]: I1203 12:42:54.404043 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6cf0822e-a898-4a5a-9cdd-a10e0a56f468-bundle\") pod \"49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd\" (UID: \"6cf0822e-a898-4a5a-9cdd-a10e0a56f468\") " pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd" Dec 03 12:42:54 crc kubenswrapper[4711]: I1203 12:42:54.404201 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6cf0822e-a898-4a5a-9cdd-a10e0a56f468-util\") pod \"49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd\" (UID: \"6cf0822e-a898-4a5a-9cdd-a10e0a56f468\") " pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd" Dec 03 12:42:54 crc kubenswrapper[4711]: I1203 12:42:54.404610 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6cf0822e-a898-4a5a-9cdd-a10e0a56f468-bundle\") pod 
\"49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd\" (UID: \"6cf0822e-a898-4a5a-9cdd-a10e0a56f468\") " pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd" Dec 03 12:42:54 crc kubenswrapper[4711]: I1203 12:42:54.405005 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6cf0822e-a898-4a5a-9cdd-a10e0a56f468-util\") pod \"49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd\" (UID: \"6cf0822e-a898-4a5a-9cdd-a10e0a56f468\") " pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd" Dec 03 12:42:54 crc kubenswrapper[4711]: I1203 12:42:54.425490 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsm87\" (UniqueName: \"kubernetes.io/projected/6cf0822e-a898-4a5a-9cdd-a10e0a56f468-kube-api-access-lsm87\") pod \"49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd\" (UID: \"6cf0822e-a898-4a5a-9cdd-a10e0a56f468\") " pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd" Dec 03 12:42:54 crc kubenswrapper[4711]: I1203 12:42:54.523764 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd" Dec 03 12:42:55 crc kubenswrapper[4711]: I1203 12:42:55.006523 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd"] Dec 03 12:42:55 crc kubenswrapper[4711]: I1203 12:42:55.427300 4711 generic.go:334] "Generic (PLEG): container finished" podID="6cf0822e-a898-4a5a-9cdd-a10e0a56f468" containerID="b415361cc341afc2f41781bb004d2574219f0da06e708adb0d14950db84c5b18" exitCode=0 Dec 03 12:42:55 crc kubenswrapper[4711]: I1203 12:42:55.427368 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd" event={"ID":"6cf0822e-a898-4a5a-9cdd-a10e0a56f468","Type":"ContainerDied","Data":"b415361cc341afc2f41781bb004d2574219f0da06e708adb0d14950db84c5b18"} Dec 03 12:42:55 crc kubenswrapper[4711]: I1203 12:42:55.427623 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd" event={"ID":"6cf0822e-a898-4a5a-9cdd-a10e0a56f468","Type":"ContainerStarted","Data":"1a3c2ea841af592b13ab7a927df601707a9245cb9a67787b0f96e2660a3dea46"} Dec 03 12:42:56 crc kubenswrapper[4711]: I1203 12:42:56.436564 4711 generic.go:334] "Generic (PLEG): container finished" podID="6cf0822e-a898-4a5a-9cdd-a10e0a56f468" containerID="48b561894da6dd0c52e29164281a0c34b14ca13d102cef69ebf5b52ba8ab219c" exitCode=0 Dec 03 12:42:56 crc kubenswrapper[4711]: I1203 12:42:56.436619 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd" event={"ID":"6cf0822e-a898-4a5a-9cdd-a10e0a56f468","Type":"ContainerDied","Data":"48b561894da6dd0c52e29164281a0c34b14ca13d102cef69ebf5b52ba8ab219c"} Dec 03 12:42:57 crc kubenswrapper[4711]: I1203 12:42:57.445734 4711 generic.go:334] 
"Generic (PLEG): container finished" podID="6cf0822e-a898-4a5a-9cdd-a10e0a56f468" containerID="d8b03a28c64418f2ec07217d2550cb85bdbe3adc014f46bdc2c7ac28f8655d17" exitCode=0 Dec 03 12:42:57 crc kubenswrapper[4711]: I1203 12:42:57.445778 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd" event={"ID":"6cf0822e-a898-4a5a-9cdd-a10e0a56f468","Type":"ContainerDied","Data":"d8b03a28c64418f2ec07217d2550cb85bdbe3adc014f46bdc2c7ac28f8655d17"} Dec 03 12:42:58 crc kubenswrapper[4711]: I1203 12:42:58.809006 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd" Dec 03 12:42:58 crc kubenswrapper[4711]: I1203 12:42:58.821783 4711 scope.go:117] "RemoveContainer" containerID="8aceb8eade31346b968b79b9eafdd0ba0740fbe4943a8f7cf724864aa6dd593f" Dec 03 12:42:58 crc kubenswrapper[4711]: E1203 12:42:58.822084 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:42:58 crc kubenswrapper[4711]: I1203 12:42:58.866322 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6cf0822e-a898-4a5a-9cdd-a10e0a56f468-bundle\") pod \"6cf0822e-a898-4a5a-9cdd-a10e0a56f468\" (UID: \"6cf0822e-a898-4a5a-9cdd-a10e0a56f468\") " Dec 03 12:42:58 crc kubenswrapper[4711]: I1203 12:42:58.866383 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsm87\" (UniqueName: 
\"kubernetes.io/projected/6cf0822e-a898-4a5a-9cdd-a10e0a56f468-kube-api-access-lsm87\") pod \"6cf0822e-a898-4a5a-9cdd-a10e0a56f468\" (UID: \"6cf0822e-a898-4a5a-9cdd-a10e0a56f468\") " Dec 03 12:42:58 crc kubenswrapper[4711]: I1203 12:42:58.866420 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6cf0822e-a898-4a5a-9cdd-a10e0a56f468-util\") pod \"6cf0822e-a898-4a5a-9cdd-a10e0a56f468\" (UID: \"6cf0822e-a898-4a5a-9cdd-a10e0a56f468\") " Dec 03 12:42:58 crc kubenswrapper[4711]: I1203 12:42:58.867528 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cf0822e-a898-4a5a-9cdd-a10e0a56f468-bundle" (OuterVolumeSpecName: "bundle") pod "6cf0822e-a898-4a5a-9cdd-a10e0a56f468" (UID: "6cf0822e-a898-4a5a-9cdd-a10e0a56f468"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:42:58 crc kubenswrapper[4711]: I1203 12:42:58.873577 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cf0822e-a898-4a5a-9cdd-a10e0a56f468-kube-api-access-lsm87" (OuterVolumeSpecName: "kube-api-access-lsm87") pod "6cf0822e-a898-4a5a-9cdd-a10e0a56f468" (UID: "6cf0822e-a898-4a5a-9cdd-a10e0a56f468"). InnerVolumeSpecName "kube-api-access-lsm87". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:42:58 crc kubenswrapper[4711]: I1203 12:42:58.881148 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cf0822e-a898-4a5a-9cdd-a10e0a56f468-util" (OuterVolumeSpecName: "util") pod "6cf0822e-a898-4a5a-9cdd-a10e0a56f468" (UID: "6cf0822e-a898-4a5a-9cdd-a10e0a56f468"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:42:58 crc kubenswrapper[4711]: I1203 12:42:58.968458 4711 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6cf0822e-a898-4a5a-9cdd-a10e0a56f468-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:42:58 crc kubenswrapper[4711]: I1203 12:42:58.968511 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsm87\" (UniqueName: \"kubernetes.io/projected/6cf0822e-a898-4a5a-9cdd-a10e0a56f468-kube-api-access-lsm87\") on node \"crc\" DevicePath \"\"" Dec 03 12:42:58 crc kubenswrapper[4711]: I1203 12:42:58.968532 4711 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6cf0822e-a898-4a5a-9cdd-a10e0a56f468-util\") on node \"crc\" DevicePath \"\"" Dec 03 12:42:59 crc kubenswrapper[4711]: I1203 12:42:59.463507 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd" event={"ID":"6cf0822e-a898-4a5a-9cdd-a10e0a56f468","Type":"ContainerDied","Data":"1a3c2ea841af592b13ab7a927df601707a9245cb9a67787b0f96e2660a3dea46"} Dec 03 12:42:59 crc kubenswrapper[4711]: I1203 12:42:59.463561 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a3c2ea841af592b13ab7a927df601707a9245cb9a67787b0f96e2660a3dea46" Dec 03 12:42:59 crc kubenswrapper[4711]: I1203 12:42:59.463587 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd" Dec 03 12:43:08 crc kubenswrapper[4711]: I1203 12:43:08.103787 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-55bc77bd75-m9gpn"] Dec 03 12:43:08 crc kubenswrapper[4711]: E1203 12:43:08.104665 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cf0822e-a898-4a5a-9cdd-a10e0a56f468" containerName="extract" Dec 03 12:43:08 crc kubenswrapper[4711]: I1203 12:43:08.104681 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf0822e-a898-4a5a-9cdd-a10e0a56f468" containerName="extract" Dec 03 12:43:08 crc kubenswrapper[4711]: E1203 12:43:08.104708 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cf0822e-a898-4a5a-9cdd-a10e0a56f468" containerName="util" Dec 03 12:43:08 crc kubenswrapper[4711]: I1203 12:43:08.104717 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf0822e-a898-4a5a-9cdd-a10e0a56f468" containerName="util" Dec 03 12:43:08 crc kubenswrapper[4711]: E1203 12:43:08.104726 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cf0822e-a898-4a5a-9cdd-a10e0a56f468" containerName="pull" Dec 03 12:43:08 crc kubenswrapper[4711]: I1203 12:43:08.104734 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf0822e-a898-4a5a-9cdd-a10e0a56f468" containerName="pull" Dec 03 12:43:08 crc kubenswrapper[4711]: I1203 12:43:08.104865 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cf0822e-a898-4a5a-9cdd-a10e0a56f468" containerName="extract" Dec 03 12:43:08 crc kubenswrapper[4711]: I1203 12:43:08.105394 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-55bc77bd75-m9gpn" Dec 03 12:43:08 crc kubenswrapper[4711]: I1203 12:43:08.107778 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Dec 03 12:43:08 crc kubenswrapper[4711]: I1203 12:43:08.108002 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-jjp8p" Dec 03 12:43:08 crc kubenswrapper[4711]: I1203 12:43:08.111372 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8a3b9c74-bb7e-48cb-926d-5604a3bdb65c-webhook-cert\") pod \"keystone-operator-controller-manager-55bc77bd75-m9gpn\" (UID: \"8a3b9c74-bb7e-48cb-926d-5604a3bdb65c\") " pod="openstack-operators/keystone-operator-controller-manager-55bc77bd75-m9gpn" Dec 03 12:43:08 crc kubenswrapper[4711]: I1203 12:43:08.111417 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8a3b9c74-bb7e-48cb-926d-5604a3bdb65c-apiservice-cert\") pod \"keystone-operator-controller-manager-55bc77bd75-m9gpn\" (UID: \"8a3b9c74-bb7e-48cb-926d-5604a3bdb65c\") " pod="openstack-operators/keystone-operator-controller-manager-55bc77bd75-m9gpn" Dec 03 12:43:08 crc kubenswrapper[4711]: I1203 12:43:08.111520 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7gc7\" (UniqueName: \"kubernetes.io/projected/8a3b9c74-bb7e-48cb-926d-5604a3bdb65c-kube-api-access-z7gc7\") pod \"keystone-operator-controller-manager-55bc77bd75-m9gpn\" (UID: \"8a3b9c74-bb7e-48cb-926d-5604a3bdb65c\") " pod="openstack-operators/keystone-operator-controller-manager-55bc77bd75-m9gpn" Dec 03 12:43:08 crc kubenswrapper[4711]: I1203 12:43:08.117497 4711 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-55bc77bd75-m9gpn"] Dec 03 12:43:08 crc kubenswrapper[4711]: I1203 12:43:08.213130 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7gc7\" (UniqueName: \"kubernetes.io/projected/8a3b9c74-bb7e-48cb-926d-5604a3bdb65c-kube-api-access-z7gc7\") pod \"keystone-operator-controller-manager-55bc77bd75-m9gpn\" (UID: \"8a3b9c74-bb7e-48cb-926d-5604a3bdb65c\") " pod="openstack-operators/keystone-operator-controller-manager-55bc77bd75-m9gpn" Dec 03 12:43:08 crc kubenswrapper[4711]: I1203 12:43:08.213214 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8a3b9c74-bb7e-48cb-926d-5604a3bdb65c-webhook-cert\") pod \"keystone-operator-controller-manager-55bc77bd75-m9gpn\" (UID: \"8a3b9c74-bb7e-48cb-926d-5604a3bdb65c\") " pod="openstack-operators/keystone-operator-controller-manager-55bc77bd75-m9gpn" Dec 03 12:43:08 crc kubenswrapper[4711]: I1203 12:43:08.213266 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8a3b9c74-bb7e-48cb-926d-5604a3bdb65c-apiservice-cert\") pod \"keystone-operator-controller-manager-55bc77bd75-m9gpn\" (UID: \"8a3b9c74-bb7e-48cb-926d-5604a3bdb65c\") " pod="openstack-operators/keystone-operator-controller-manager-55bc77bd75-m9gpn" Dec 03 12:43:08 crc kubenswrapper[4711]: I1203 12:43:08.219951 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8a3b9c74-bb7e-48cb-926d-5604a3bdb65c-webhook-cert\") pod \"keystone-operator-controller-manager-55bc77bd75-m9gpn\" (UID: \"8a3b9c74-bb7e-48cb-926d-5604a3bdb65c\") " pod="openstack-operators/keystone-operator-controller-manager-55bc77bd75-m9gpn" Dec 03 12:43:08 crc kubenswrapper[4711]: I1203 12:43:08.220464 4711 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8a3b9c74-bb7e-48cb-926d-5604a3bdb65c-apiservice-cert\") pod \"keystone-operator-controller-manager-55bc77bd75-m9gpn\" (UID: \"8a3b9c74-bb7e-48cb-926d-5604a3bdb65c\") " pod="openstack-operators/keystone-operator-controller-manager-55bc77bd75-m9gpn" Dec 03 12:43:08 crc kubenswrapper[4711]: I1203 12:43:08.229997 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7gc7\" (UniqueName: \"kubernetes.io/projected/8a3b9c74-bb7e-48cb-926d-5604a3bdb65c-kube-api-access-z7gc7\") pod \"keystone-operator-controller-manager-55bc77bd75-m9gpn\" (UID: \"8a3b9c74-bb7e-48cb-926d-5604a3bdb65c\") " pod="openstack-operators/keystone-operator-controller-manager-55bc77bd75-m9gpn" Dec 03 12:43:08 crc kubenswrapper[4711]: I1203 12:43:08.426976 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-55bc77bd75-m9gpn" Dec 03 12:43:08 crc kubenswrapper[4711]: I1203 12:43:08.852980 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-55bc77bd75-m9gpn"] Dec 03 12:43:09 crc kubenswrapper[4711]: I1203 12:43:09.535277 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-55bc77bd75-m9gpn" event={"ID":"8a3b9c74-bb7e-48cb-926d-5604a3bdb65c","Type":"ContainerStarted","Data":"c91a38603a0f2beb83bbd63a5901f4400f1236536bfb6e6d9556dc1850c247e3"} Dec 03 12:43:13 crc kubenswrapper[4711]: I1203 12:43:13.817288 4711 scope.go:117] "RemoveContainer" containerID="8aceb8eade31346b968b79b9eafdd0ba0740fbe4943a8f7cf724864aa6dd593f" Dec 03 12:43:13 crc kubenswrapper[4711]: E1203 12:43:13.818016 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:43:14 crc kubenswrapper[4711]: I1203 12:43:14.590482 4711 generic.go:334] "Generic (PLEG): container finished" podID="3b4ec49f-4f73-4f76-9945-01577848b79c" containerID="e697619310d41a4f6e93095ec1836329f8fdc371d8bdb12d2801b13da1e373f0" exitCode=0 Dec 03 12:43:14 crc kubenswrapper[4711]: I1203 12:43:14.590579 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"3b4ec49f-4f73-4f76-9945-01577848b79c","Type":"ContainerDied","Data":"e697619310d41a4f6e93095ec1836329f8fdc371d8bdb12d2801b13da1e373f0"} Dec 03 12:43:17 crc kubenswrapper[4711]: I1203 12:43:17.612159 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"3b4ec49f-4f73-4f76-9945-01577848b79c","Type":"ContainerStarted","Data":"8dd10868966e003181eccfecf536d5c6116b87da1ebdec63db9075f26876ca11"} Dec 03 12:43:17 crc kubenswrapper[4711]: I1203 12:43:17.613011 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/rabbitmq-server-0" Dec 03 12:43:17 crc kubenswrapper[4711]: I1203 12:43:17.614701 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-55bc77bd75-m9gpn" event={"ID":"8a3b9c74-bb7e-48cb-926d-5604a3bdb65c","Type":"ContainerStarted","Data":"7420f26d6db55986615ffb8a33422fb16ea80269ae5eff5ea66f355250a3fcb4"} Dec 03 12:43:17 crc kubenswrapper[4711]: I1203 12:43:17.614958 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-55bc77bd75-m9gpn" Dec 03 12:43:17 crc kubenswrapper[4711]: I1203 12:43:17.635545 4711 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="glance-kuttl-tests/rabbitmq-server-0" podStartSLOduration=38.537681129 podStartE2EDuration="44.635526566s" podCreationTimestamp="2025-12-03 12:42:33 +0000 UTC" firstStartedPulling="2025-12-03 12:42:34.844306663 +0000 UTC m=+1673.513557918" lastFinishedPulling="2025-12-03 12:42:40.9421521 +0000 UTC m=+1679.611403355" observedRunningTime="2025-12-03 12:43:17.632171473 +0000 UTC m=+1716.301422748" watchObservedRunningTime="2025-12-03 12:43:17.635526566 +0000 UTC m=+1716.304777821" Dec 03 12:43:17 crc kubenswrapper[4711]: I1203 12:43:17.657993 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-55bc77bd75-m9gpn" podStartSLOduration=1.9616097 podStartE2EDuration="9.657975475s" podCreationTimestamp="2025-12-03 12:43:08 +0000 UTC" firstStartedPulling="2025-12-03 12:43:08.858502026 +0000 UTC m=+1707.527753281" lastFinishedPulling="2025-12-03 12:43:16.554867801 +0000 UTC m=+1715.224119056" observedRunningTime="2025-12-03 12:43:17.655086696 +0000 UTC m=+1716.324337961" watchObservedRunningTime="2025-12-03 12:43:17.657975475 +0000 UTC m=+1716.327226730" Dec 03 12:43:25 crc kubenswrapper[4711]: I1203 12:43:25.817085 4711 scope.go:117] "RemoveContainer" containerID="8aceb8eade31346b968b79b9eafdd0ba0740fbe4943a8f7cf724864aa6dd593f" Dec 03 12:43:25 crc kubenswrapper[4711]: E1203 12:43:25.817900 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:43:28 crc kubenswrapper[4711]: I1203 12:43:28.431870 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/keystone-operator-controller-manager-55bc77bd75-m9gpn" Dec 03 12:43:33 crc kubenswrapper[4711]: I1203 12:43:33.502939 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-db-create-xmfcd"] Dec 03 12:43:33 crc kubenswrapper[4711]: I1203 12:43:33.504330 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-xmfcd" Dec 03 12:43:33 crc kubenswrapper[4711]: I1203 12:43:33.508808 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-2462-account-create-update-spvfj"] Dec 03 12:43:33 crc kubenswrapper[4711]: I1203 12:43:33.509742 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-2462-account-create-update-spvfj" Dec 03 12:43:33 crc kubenswrapper[4711]: I1203 12:43:33.511761 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-db-secret" Dec 03 12:43:33 crc kubenswrapper[4711]: I1203 12:43:33.520936 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-2462-account-create-update-spvfj"] Dec 03 12:43:33 crc kubenswrapper[4711]: I1203 12:43:33.525347 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-create-xmfcd"] Dec 03 12:43:33 crc kubenswrapper[4711]: I1203 12:43:33.588553 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m6pn\" (UniqueName: \"kubernetes.io/projected/2112d2ef-102c-4a01-8124-6b17a34f50a5-kube-api-access-2m6pn\") pod \"keystone-2462-account-create-update-spvfj\" (UID: \"2112d2ef-102c-4a01-8124-6b17a34f50a5\") " pod="glance-kuttl-tests/keystone-2462-account-create-update-spvfj" Dec 03 12:43:33 crc kubenswrapper[4711]: I1203 12:43:33.588633 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/3eb08f39-558e-497e-a3dc-f01b252384e4-operator-scripts\") pod \"keystone-db-create-xmfcd\" (UID: \"3eb08f39-558e-497e-a3dc-f01b252384e4\") " pod="glance-kuttl-tests/keystone-db-create-xmfcd" Dec 03 12:43:33 crc kubenswrapper[4711]: I1203 12:43:33.588685 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2112d2ef-102c-4a01-8124-6b17a34f50a5-operator-scripts\") pod \"keystone-2462-account-create-update-spvfj\" (UID: \"2112d2ef-102c-4a01-8124-6b17a34f50a5\") " pod="glance-kuttl-tests/keystone-2462-account-create-update-spvfj" Dec 03 12:43:33 crc kubenswrapper[4711]: I1203 12:43:33.588780 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtkdz\" (UniqueName: \"kubernetes.io/projected/3eb08f39-558e-497e-a3dc-f01b252384e4-kube-api-access-gtkdz\") pod \"keystone-db-create-xmfcd\" (UID: \"3eb08f39-558e-497e-a3dc-f01b252384e4\") " pod="glance-kuttl-tests/keystone-db-create-xmfcd" Dec 03 12:43:33 crc kubenswrapper[4711]: I1203 12:43:33.690613 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3eb08f39-558e-497e-a3dc-f01b252384e4-operator-scripts\") pod \"keystone-db-create-xmfcd\" (UID: \"3eb08f39-558e-497e-a3dc-f01b252384e4\") " pod="glance-kuttl-tests/keystone-db-create-xmfcd" Dec 03 12:43:33 crc kubenswrapper[4711]: I1203 12:43:33.690687 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2112d2ef-102c-4a01-8124-6b17a34f50a5-operator-scripts\") pod \"keystone-2462-account-create-update-spvfj\" (UID: \"2112d2ef-102c-4a01-8124-6b17a34f50a5\") " pod="glance-kuttl-tests/keystone-2462-account-create-update-spvfj" Dec 03 12:43:33 crc kubenswrapper[4711]: I1203 12:43:33.690763 4711 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtkdz\" (UniqueName: \"kubernetes.io/projected/3eb08f39-558e-497e-a3dc-f01b252384e4-kube-api-access-gtkdz\") pod \"keystone-db-create-xmfcd\" (UID: \"3eb08f39-558e-497e-a3dc-f01b252384e4\") " pod="glance-kuttl-tests/keystone-db-create-xmfcd" Dec 03 12:43:33 crc kubenswrapper[4711]: I1203 12:43:33.690825 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m6pn\" (UniqueName: \"kubernetes.io/projected/2112d2ef-102c-4a01-8124-6b17a34f50a5-kube-api-access-2m6pn\") pod \"keystone-2462-account-create-update-spvfj\" (UID: \"2112d2ef-102c-4a01-8124-6b17a34f50a5\") " pod="glance-kuttl-tests/keystone-2462-account-create-update-spvfj" Dec 03 12:43:33 crc kubenswrapper[4711]: I1203 12:43:33.691586 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3eb08f39-558e-497e-a3dc-f01b252384e4-operator-scripts\") pod \"keystone-db-create-xmfcd\" (UID: \"3eb08f39-558e-497e-a3dc-f01b252384e4\") " pod="glance-kuttl-tests/keystone-db-create-xmfcd" Dec 03 12:43:33 crc kubenswrapper[4711]: I1203 12:43:33.691667 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2112d2ef-102c-4a01-8124-6b17a34f50a5-operator-scripts\") pod \"keystone-2462-account-create-update-spvfj\" (UID: \"2112d2ef-102c-4a01-8124-6b17a34f50a5\") " pod="glance-kuttl-tests/keystone-2462-account-create-update-spvfj" Dec 03 12:43:33 crc kubenswrapper[4711]: I1203 12:43:33.719178 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtkdz\" (UniqueName: \"kubernetes.io/projected/3eb08f39-558e-497e-a3dc-f01b252384e4-kube-api-access-gtkdz\") pod \"keystone-db-create-xmfcd\" (UID: \"3eb08f39-558e-497e-a3dc-f01b252384e4\") " pod="glance-kuttl-tests/keystone-db-create-xmfcd" Dec 03 12:43:33 crc 
kubenswrapper[4711]: I1203 12:43:33.736104 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m6pn\" (UniqueName: \"kubernetes.io/projected/2112d2ef-102c-4a01-8124-6b17a34f50a5-kube-api-access-2m6pn\") pod \"keystone-2462-account-create-update-spvfj\" (UID: \"2112d2ef-102c-4a01-8124-6b17a34f50a5\") " pod="glance-kuttl-tests/keystone-2462-account-create-update-spvfj" Dec 03 12:43:33 crc kubenswrapper[4711]: I1203 12:43:33.820963 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-xmfcd" Dec 03 12:43:33 crc kubenswrapper[4711]: I1203 12:43:33.831138 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-2462-account-create-update-spvfj" Dec 03 12:43:34 crc kubenswrapper[4711]: I1203 12:43:34.151711 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-create-xmfcd"] Dec 03 12:43:34 crc kubenswrapper[4711]: W1203 12:43:34.162581 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3eb08f39_558e_497e_a3dc_f01b252384e4.slice/crio-c87c126ad1dc3f01b66a86248941ab33db261cf92b8fb2260cd12ee3efba6b17 WatchSource:0}: Error finding container c87c126ad1dc3f01b66a86248941ab33db261cf92b8fb2260cd12ee3efba6b17: Status 404 returned error can't find the container with id c87c126ad1dc3f01b66a86248941ab33db261cf92b8fb2260cd12ee3efba6b17 Dec 03 12:43:34 crc kubenswrapper[4711]: I1203 12:43:34.210588 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-2462-account-create-update-spvfj"] Dec 03 12:43:34 crc kubenswrapper[4711]: W1203 12:43:34.234249 4711 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2112d2ef_102c_4a01_8124_6b17a34f50a5.slice/crio-b0231803b3de9e524255024233c8b9c8707521e2cdd4673f08d55b4b98fa8fee WatchSource:0}: Error finding container b0231803b3de9e524255024233c8b9c8707521e2cdd4673f08d55b4b98fa8fee: Status 404 returned error can't find the container with id b0231803b3de9e524255024233c8b9c8707521e2cdd4673f08d55b4b98fa8fee Dec 03 12:43:34 crc kubenswrapper[4711]: I1203 12:43:34.556930 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/rabbitmq-server-0" Dec 03 12:43:34 crc kubenswrapper[4711]: I1203 12:43:34.740617 4711 generic.go:334] "Generic (PLEG): container finished" podID="2112d2ef-102c-4a01-8124-6b17a34f50a5" containerID="fd11cddbff4440c60d9f8f3dac30f6f6f5d1df4ab243dbf34604304c43621b60" exitCode=0 Dec 03 12:43:34 crc kubenswrapper[4711]: I1203 12:43:34.740689 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-2462-account-create-update-spvfj" event={"ID":"2112d2ef-102c-4a01-8124-6b17a34f50a5","Type":"ContainerDied","Data":"fd11cddbff4440c60d9f8f3dac30f6f6f5d1df4ab243dbf34604304c43621b60"} Dec 03 12:43:34 crc kubenswrapper[4711]: I1203 12:43:34.740722 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-2462-account-create-update-spvfj" event={"ID":"2112d2ef-102c-4a01-8124-6b17a34f50a5","Type":"ContainerStarted","Data":"b0231803b3de9e524255024233c8b9c8707521e2cdd4673f08d55b4b98fa8fee"} Dec 03 12:43:34 crc kubenswrapper[4711]: I1203 12:43:34.743320 4711 generic.go:334] "Generic (PLEG): container finished" podID="3eb08f39-558e-497e-a3dc-f01b252384e4" containerID="ba23666c2a5b4741700eedaa5fdc01b243680e994a0f699f8432cccdca8ea50f" exitCode=0 Dec 03 12:43:34 crc kubenswrapper[4711]: I1203 12:43:34.743365 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-xmfcd" 
event={"ID":"3eb08f39-558e-497e-a3dc-f01b252384e4","Type":"ContainerDied","Data":"ba23666c2a5b4741700eedaa5fdc01b243680e994a0f699f8432cccdca8ea50f"} Dec 03 12:43:34 crc kubenswrapper[4711]: I1203 12:43:34.743387 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-xmfcd" event={"ID":"3eb08f39-558e-497e-a3dc-f01b252384e4","Type":"ContainerStarted","Data":"c87c126ad1dc3f01b66a86248941ab33db261cf92b8fb2260cd12ee3efba6b17"} Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.048761 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-xmfcd" Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.053205 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-2462-account-create-update-spvfj" Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.126512 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3eb08f39-558e-497e-a3dc-f01b252384e4-operator-scripts\") pod \"3eb08f39-558e-497e-a3dc-f01b252384e4\" (UID: \"3eb08f39-558e-497e-a3dc-f01b252384e4\") " Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.126570 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtkdz\" (UniqueName: \"kubernetes.io/projected/3eb08f39-558e-497e-a3dc-f01b252384e4-kube-api-access-gtkdz\") pod \"3eb08f39-558e-497e-a3dc-f01b252384e4\" (UID: \"3eb08f39-558e-497e-a3dc-f01b252384e4\") " Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.126675 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2112d2ef-102c-4a01-8124-6b17a34f50a5-operator-scripts\") pod \"2112d2ef-102c-4a01-8124-6b17a34f50a5\" (UID: \"2112d2ef-102c-4a01-8124-6b17a34f50a5\") " Dec 03 12:43:36 crc 
kubenswrapper[4711]: I1203 12:43:36.126709 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m6pn\" (UniqueName: \"kubernetes.io/projected/2112d2ef-102c-4a01-8124-6b17a34f50a5-kube-api-access-2m6pn\") pod \"2112d2ef-102c-4a01-8124-6b17a34f50a5\" (UID: \"2112d2ef-102c-4a01-8124-6b17a34f50a5\") " Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.127893 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3eb08f39-558e-497e-a3dc-f01b252384e4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3eb08f39-558e-497e-a3dc-f01b252384e4" (UID: "3eb08f39-558e-497e-a3dc-f01b252384e4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.128216 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2112d2ef-102c-4a01-8124-6b17a34f50a5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2112d2ef-102c-4a01-8124-6b17a34f50a5" (UID: "2112d2ef-102c-4a01-8124-6b17a34f50a5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.132279 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eb08f39-558e-497e-a3dc-f01b252384e4-kube-api-access-gtkdz" (OuterVolumeSpecName: "kube-api-access-gtkdz") pod "3eb08f39-558e-497e-a3dc-f01b252384e4" (UID: "3eb08f39-558e-497e-a3dc-f01b252384e4"). InnerVolumeSpecName "kube-api-access-gtkdz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.132317 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2112d2ef-102c-4a01-8124-6b17a34f50a5-kube-api-access-2m6pn" (OuterVolumeSpecName: "kube-api-access-2m6pn") pod "2112d2ef-102c-4a01-8124-6b17a34f50a5" (UID: "2112d2ef-102c-4a01-8124-6b17a34f50a5"). InnerVolumeSpecName "kube-api-access-2m6pn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.228201 4711 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2112d2ef-102c-4a01-8124-6b17a34f50a5-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.228250 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m6pn\" (UniqueName: \"kubernetes.io/projected/2112d2ef-102c-4a01-8124-6b17a34f50a5-kube-api-access-2m6pn\") on node \"crc\" DevicePath \"\"" Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.228265 4711 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3eb08f39-558e-497e-a3dc-f01b252384e4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.228277 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtkdz\" (UniqueName: \"kubernetes.io/projected/3eb08f39-558e-497e-a3dc-f01b252384e4-kube-api-access-gtkdz\") on node \"crc\" DevicePath \"\"" Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.368783 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-index-b29vt"] Dec 03 12:43:36 crc kubenswrapper[4711]: E1203 12:43:36.369125 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2112d2ef-102c-4a01-8124-6b17a34f50a5" 
containerName="mariadb-account-create-update" Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.369144 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="2112d2ef-102c-4a01-8124-6b17a34f50a5" containerName="mariadb-account-create-update" Dec 03 12:43:36 crc kubenswrapper[4711]: E1203 12:43:36.369166 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eb08f39-558e-497e-a3dc-f01b252384e4" containerName="mariadb-database-create" Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.369177 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eb08f39-558e-497e-a3dc-f01b252384e4" containerName="mariadb-database-create" Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.369337 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="2112d2ef-102c-4a01-8124-6b17a34f50a5" containerName="mariadb-account-create-update" Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.369376 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eb08f39-558e-497e-a3dc-f01b252384e4" containerName="mariadb-database-create" Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.369984 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-index-b29vt" Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.372576 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-index-dockercfg-7b4xz" Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.386521 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-b29vt"] Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.431491 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drx9g\" (UniqueName: \"kubernetes.io/projected/c81f51fc-d96d-41de-8092-16a361a1e1cc-kube-api-access-drx9g\") pod \"horizon-operator-index-b29vt\" (UID: \"c81f51fc-d96d-41de-8092-16a361a1e1cc\") " pod="openstack-operators/horizon-operator-index-b29vt" Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.533244 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drx9g\" (UniqueName: \"kubernetes.io/projected/c81f51fc-d96d-41de-8092-16a361a1e1cc-kube-api-access-drx9g\") pod \"horizon-operator-index-b29vt\" (UID: \"c81f51fc-d96d-41de-8092-16a361a1e1cc\") " pod="openstack-operators/horizon-operator-index-b29vt" Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.556954 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drx9g\" (UniqueName: \"kubernetes.io/projected/c81f51fc-d96d-41de-8092-16a361a1e1cc-kube-api-access-drx9g\") pod \"horizon-operator-index-b29vt\" (UID: \"c81f51fc-d96d-41de-8092-16a361a1e1cc\") " pod="openstack-operators/horizon-operator-index-b29vt" Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.687890 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-index-b29vt" Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.764572 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-2462-account-create-update-spvfj" event={"ID":"2112d2ef-102c-4a01-8124-6b17a34f50a5","Type":"ContainerDied","Data":"b0231803b3de9e524255024233c8b9c8707521e2cdd4673f08d55b4b98fa8fee"} Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.764867 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0231803b3de9e524255024233c8b9c8707521e2cdd4673f08d55b4b98fa8fee" Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.764585 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-2462-account-create-update-spvfj" Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.766527 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-xmfcd" Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.766523 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-xmfcd" event={"ID":"3eb08f39-558e-497e-a3dc-f01b252384e4","Type":"ContainerDied","Data":"c87c126ad1dc3f01b66a86248941ab33db261cf92b8fb2260cd12ee3efba6b17"} Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.766567 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c87c126ad1dc3f01b66a86248941ab33db261cf92b8fb2260cd12ee3efba6b17" Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.817364 4711 scope.go:117] "RemoveContainer" containerID="8aceb8eade31346b968b79b9eafdd0ba0740fbe4943a8f7cf724864aa6dd593f" Dec 03 12:43:36 crc kubenswrapper[4711]: E1203 12:43:36.818054 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.892134 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-b29vt"] Dec 03 12:43:36 crc kubenswrapper[4711]: I1203 12:43:36.905487 4711 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 12:43:37 crc kubenswrapper[4711]: I1203 12:43:37.774158 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-b29vt" event={"ID":"c81f51fc-d96d-41de-8092-16a361a1e1cc","Type":"ContainerStarted","Data":"3b5eeece610102e3f9d78e60263bab630263da1843c8686b077d43d43872ae46"} Dec 03 12:43:38 crc kubenswrapper[4711]: I1203 12:43:38.978113 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-db-sync-k5dbg"] Dec 03 12:43:38 crc kubenswrapper[4711]: I1203 12:43:38.980267 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-k5dbg" Dec 03 12:43:38 crc kubenswrapper[4711]: I1203 12:43:38.982640 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone" Dec 03 12:43:38 crc kubenswrapper[4711]: I1203 12:43:38.983071 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-89xf2" Dec 03 12:43:38 crc kubenswrapper[4711]: I1203 12:43:38.983428 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts" Dec 03 12:43:38 crc kubenswrapper[4711]: I1203 12:43:38.983795 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data" Dec 03 12:43:39 crc kubenswrapper[4711]: I1203 12:43:39.001505 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-k5dbg"] Dec 03 12:43:39 crc kubenswrapper[4711]: I1203 12:43:39.070928 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z5f9\" (UniqueName: \"kubernetes.io/projected/6f3afe72-bd28-46b4-9956-be4f9727c405-kube-api-access-6z5f9\") pod \"keystone-db-sync-k5dbg\" (UID: \"6f3afe72-bd28-46b4-9956-be4f9727c405\") " pod="glance-kuttl-tests/keystone-db-sync-k5dbg" Dec 03 12:43:39 crc kubenswrapper[4711]: I1203 12:43:39.071318 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f3afe72-bd28-46b4-9956-be4f9727c405-config-data\") pod \"keystone-db-sync-k5dbg\" (UID: \"6f3afe72-bd28-46b4-9956-be4f9727c405\") " pod="glance-kuttl-tests/keystone-db-sync-k5dbg" Dec 03 12:43:39 crc kubenswrapper[4711]: I1203 12:43:39.172481 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z5f9\" (UniqueName: 
\"kubernetes.io/projected/6f3afe72-bd28-46b4-9956-be4f9727c405-kube-api-access-6z5f9\") pod \"keystone-db-sync-k5dbg\" (UID: \"6f3afe72-bd28-46b4-9956-be4f9727c405\") " pod="glance-kuttl-tests/keystone-db-sync-k5dbg" Dec 03 12:43:39 crc kubenswrapper[4711]: I1203 12:43:39.172559 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f3afe72-bd28-46b4-9956-be4f9727c405-config-data\") pod \"keystone-db-sync-k5dbg\" (UID: \"6f3afe72-bd28-46b4-9956-be4f9727c405\") " pod="glance-kuttl-tests/keystone-db-sync-k5dbg" Dec 03 12:43:39 crc kubenswrapper[4711]: I1203 12:43:39.179246 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f3afe72-bd28-46b4-9956-be4f9727c405-config-data\") pod \"keystone-db-sync-k5dbg\" (UID: \"6f3afe72-bd28-46b4-9956-be4f9727c405\") " pod="glance-kuttl-tests/keystone-db-sync-k5dbg" Dec 03 12:43:39 crc kubenswrapper[4711]: I1203 12:43:39.190725 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z5f9\" (UniqueName: \"kubernetes.io/projected/6f3afe72-bd28-46b4-9956-be4f9727c405-kube-api-access-6z5f9\") pod \"keystone-db-sync-k5dbg\" (UID: \"6f3afe72-bd28-46b4-9956-be4f9727c405\") " pod="glance-kuttl-tests/keystone-db-sync-k5dbg" Dec 03 12:43:39 crc kubenswrapper[4711]: I1203 12:43:39.297692 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-k5dbg" Dec 03 12:43:39 crc kubenswrapper[4711]: I1203 12:43:39.752136 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-k5dbg"] Dec 03 12:43:39 crc kubenswrapper[4711]: I1203 12:43:39.776670 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-index-lw8pr"] Dec 03 12:43:39 crc kubenswrapper[4711]: I1203 12:43:39.777463 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-index-lw8pr" Dec 03 12:43:39 crc kubenswrapper[4711]: I1203 12:43:39.780462 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-index-dockercfg-qds8g" Dec 03 12:43:39 crc kubenswrapper[4711]: I1203 12:43:39.788211 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-lw8pr"] Dec 03 12:43:39 crc kubenswrapper[4711]: I1203 12:43:39.889763 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mngrd\" (UniqueName: \"kubernetes.io/projected/410cf095-14f3-44ce-8308-1af8c60b99db-kube-api-access-mngrd\") pod \"swift-operator-index-lw8pr\" (UID: \"410cf095-14f3-44ce-8308-1af8c60b99db\") " pod="openstack-operators/swift-operator-index-lw8pr" Dec 03 12:43:39 crc kubenswrapper[4711]: I1203 12:43:39.990956 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mngrd\" (UniqueName: \"kubernetes.io/projected/410cf095-14f3-44ce-8308-1af8c60b99db-kube-api-access-mngrd\") pod \"swift-operator-index-lw8pr\" (UID: \"410cf095-14f3-44ce-8308-1af8c60b99db\") " pod="openstack-operators/swift-operator-index-lw8pr" Dec 03 12:43:40 crc kubenswrapper[4711]: I1203 12:43:40.011498 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mngrd\" (UniqueName: \"kubernetes.io/projected/410cf095-14f3-44ce-8308-1af8c60b99db-kube-api-access-mngrd\") pod \"swift-operator-index-lw8pr\" (UID: \"410cf095-14f3-44ce-8308-1af8c60b99db\") " pod="openstack-operators/swift-operator-index-lw8pr" Dec 03 12:43:40 crc kubenswrapper[4711]: I1203 12:43:40.100073 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-index-lw8pr" Dec 03 12:43:40 crc kubenswrapper[4711]: I1203 12:43:40.509979 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-lw8pr"] Dec 03 12:43:40 crc kubenswrapper[4711]: W1203 12:43:40.515508 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod410cf095_14f3_44ce_8308_1af8c60b99db.slice/crio-7c4d25b79b021639a8a3148132b1339fc30ab12cef1502eeb13ea3c4954f39b7 WatchSource:0}: Error finding container 7c4d25b79b021639a8a3148132b1339fc30ab12cef1502eeb13ea3c4954f39b7: Status 404 returned error can't find the container with id 7c4d25b79b021639a8a3148132b1339fc30ab12cef1502eeb13ea3c4954f39b7 Dec 03 12:43:40 crc kubenswrapper[4711]: I1203 12:43:40.805534 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-lw8pr" event={"ID":"410cf095-14f3-44ce-8308-1af8c60b99db","Type":"ContainerStarted","Data":"7c4d25b79b021639a8a3148132b1339fc30ab12cef1502eeb13ea3c4954f39b7"} Dec 03 12:43:40 crc kubenswrapper[4711]: I1203 12:43:40.807569 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-k5dbg" event={"ID":"6f3afe72-bd28-46b4-9956-be4f9727c405","Type":"ContainerStarted","Data":"b163129c09a65f7c17af33492b5357aa59b5528fc927fd79938ab4d022c58dac"} Dec 03 12:43:40 crc kubenswrapper[4711]: I1203 12:43:40.809279 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-b29vt" event={"ID":"c81f51fc-d96d-41de-8092-16a361a1e1cc","Type":"ContainerStarted","Data":"bc19013a7aa9ed950df687fd85eb1aefadf56c717b737013fbaf70cf7a7db10b"} Dec 03 12:43:41 crc kubenswrapper[4711]: I1203 12:43:41.847130 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-index-b29vt" podStartSLOduration=2.854104452 
podStartE2EDuration="5.847109819s" podCreationTimestamp="2025-12-03 12:43:36 +0000 UTC" firstStartedPulling="2025-12-03 12:43:36.905243572 +0000 UTC m=+1735.574494827" lastFinishedPulling="2025-12-03 12:43:39.898248939 +0000 UTC m=+1738.567500194" observedRunningTime="2025-12-03 12:43:40.825324477 +0000 UTC m=+1739.494575732" watchObservedRunningTime="2025-12-03 12:43:41.847109819 +0000 UTC m=+1740.516361074" Dec 03 12:43:41 crc kubenswrapper[4711]: I1203 12:43:41.964937 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/horizon-operator-index-b29vt"] Dec 03 12:43:42 crc kubenswrapper[4711]: I1203 12:43:42.774136 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-index-pvgpj"] Dec 03 12:43:42 crc kubenswrapper[4711]: I1203 12:43:42.775455 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-index-pvgpj" Dec 03 12:43:42 crc kubenswrapper[4711]: I1203 12:43:42.781677 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-pvgpj"] Dec 03 12:43:42 crc kubenswrapper[4711]: I1203 12:43:42.824670 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/horizon-operator-index-b29vt" podUID="c81f51fc-d96d-41de-8092-16a361a1e1cc" containerName="registry-server" containerID="cri-o://bc19013a7aa9ed950df687fd85eb1aefadf56c717b737013fbaf70cf7a7db10b" gracePeriod=2 Dec 03 12:43:42 crc kubenswrapper[4711]: I1203 12:43:42.943870 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n4v5\" (UniqueName: \"kubernetes.io/projected/174cadcb-4175-478d-a8d1-0614f02a4cc2-kube-api-access-7n4v5\") pod \"horizon-operator-index-pvgpj\" (UID: \"174cadcb-4175-478d-a8d1-0614f02a4cc2\") " pod="openstack-operators/horizon-operator-index-pvgpj" Dec 03 12:43:43 crc kubenswrapper[4711]: I1203 12:43:43.045202 4711 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n4v5\" (UniqueName: \"kubernetes.io/projected/174cadcb-4175-478d-a8d1-0614f02a4cc2-kube-api-access-7n4v5\") pod \"horizon-operator-index-pvgpj\" (UID: \"174cadcb-4175-478d-a8d1-0614f02a4cc2\") " pod="openstack-operators/horizon-operator-index-pvgpj" Dec 03 12:43:43 crc kubenswrapper[4711]: I1203 12:43:43.066713 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n4v5\" (UniqueName: \"kubernetes.io/projected/174cadcb-4175-478d-a8d1-0614f02a4cc2-kube-api-access-7n4v5\") pod \"horizon-operator-index-pvgpj\" (UID: \"174cadcb-4175-478d-a8d1-0614f02a4cc2\") " pod="openstack-operators/horizon-operator-index-pvgpj" Dec 03 12:43:43 crc kubenswrapper[4711]: I1203 12:43:43.092671 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-index-pvgpj" Dec 03 12:43:43 crc kubenswrapper[4711]: E1203 12:43:43.430745 4711 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc81f51fc_d96d_41de_8092_16a361a1e1cc.slice/crio-conmon-bc19013a7aa9ed950df687fd85eb1aefadf56c717b737013fbaf70cf7a7db10b.scope\": RecentStats: unable to find data in memory cache]" Dec 03 12:43:43 crc kubenswrapper[4711]: I1203 12:43:43.831548 4711 generic.go:334] "Generic (PLEG): container finished" podID="c81f51fc-d96d-41de-8092-16a361a1e1cc" containerID="bc19013a7aa9ed950df687fd85eb1aefadf56c717b737013fbaf70cf7a7db10b" exitCode=0 Dec 03 12:43:43 crc kubenswrapper[4711]: I1203 12:43:43.831592 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-b29vt" event={"ID":"c81f51fc-d96d-41de-8092-16a361a1e1cc","Type":"ContainerDied","Data":"bc19013a7aa9ed950df687fd85eb1aefadf56c717b737013fbaf70cf7a7db10b"} Dec 03 12:43:45 crc kubenswrapper[4711]: I1203 
12:43:45.570045 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/swift-operator-index-lw8pr"] Dec 03 12:43:46 crc kubenswrapper[4711]: I1203 12:43:46.373133 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-index-rq4cg"] Dec 03 12:43:46 crc kubenswrapper[4711]: I1203 12:43:46.374048 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-rq4cg" Dec 03 12:43:46 crc kubenswrapper[4711]: I1203 12:43:46.378318 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-rq4cg"] Dec 03 12:43:46 crc kubenswrapper[4711]: I1203 12:43:46.498699 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gw7t\" (UniqueName: \"kubernetes.io/projected/d23e5068-0a2c-40ac-bc4a-fb8a69961f3f-kube-api-access-2gw7t\") pod \"swift-operator-index-rq4cg\" (UID: \"d23e5068-0a2c-40ac-bc4a-fb8a69961f3f\") " pod="openstack-operators/swift-operator-index-rq4cg" Dec 03 12:43:46 crc kubenswrapper[4711]: I1203 12:43:46.600131 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gw7t\" (UniqueName: \"kubernetes.io/projected/d23e5068-0a2c-40ac-bc4a-fb8a69961f3f-kube-api-access-2gw7t\") pod \"swift-operator-index-rq4cg\" (UID: \"d23e5068-0a2c-40ac-bc4a-fb8a69961f3f\") " pod="openstack-operators/swift-operator-index-rq4cg" Dec 03 12:43:46 crc kubenswrapper[4711]: I1203 12:43:46.620764 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gw7t\" (UniqueName: \"kubernetes.io/projected/d23e5068-0a2c-40ac-bc4a-fb8a69961f3f-kube-api-access-2gw7t\") pod \"swift-operator-index-rq4cg\" (UID: \"d23e5068-0a2c-40ac-bc4a-fb8a69961f3f\") " pod="openstack-operators/swift-operator-index-rq4cg" Dec 03 12:43:46 crc kubenswrapper[4711]: I1203 12:43:46.688388 4711 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/horizon-operator-index-b29vt" Dec 03 12:43:46 crc kubenswrapper[4711]: I1203 12:43:46.697811 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-rq4cg" Dec 03 12:43:46 crc kubenswrapper[4711]: I1203 12:43:46.869708 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-index-b29vt" Dec 03 12:43:46 crc kubenswrapper[4711]: I1203 12:43:46.877937 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-b29vt" event={"ID":"c81f51fc-d96d-41de-8092-16a361a1e1cc","Type":"ContainerDied","Data":"3b5eeece610102e3f9d78e60263bab630263da1843c8686b077d43d43872ae46"} Dec 03 12:43:46 crc kubenswrapper[4711]: I1203 12:43:46.877983 4711 scope.go:117] "RemoveContainer" containerID="bc19013a7aa9ed950df687fd85eb1aefadf56c717b737013fbaf70cf7a7db10b" Dec 03 12:43:46 crc kubenswrapper[4711]: I1203 12:43:46.878096 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-index-b29vt" Dec 03 12:43:47 crc kubenswrapper[4711]: I1203 12:43:47.004931 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drx9g\" (UniqueName: \"kubernetes.io/projected/c81f51fc-d96d-41de-8092-16a361a1e1cc-kube-api-access-drx9g\") pod \"c81f51fc-d96d-41de-8092-16a361a1e1cc\" (UID: \"c81f51fc-d96d-41de-8092-16a361a1e1cc\") " Dec 03 12:43:47 crc kubenswrapper[4711]: I1203 12:43:47.009743 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c81f51fc-d96d-41de-8092-16a361a1e1cc-kube-api-access-drx9g" (OuterVolumeSpecName: "kube-api-access-drx9g") pod "c81f51fc-d96d-41de-8092-16a361a1e1cc" (UID: "c81f51fc-d96d-41de-8092-16a361a1e1cc"). InnerVolumeSpecName "kube-api-access-drx9g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:43:47 crc kubenswrapper[4711]: I1203 12:43:47.106787 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drx9g\" (UniqueName: \"kubernetes.io/projected/c81f51fc-d96d-41de-8092-16a361a1e1cc-kube-api-access-drx9g\") on node \"crc\" DevicePath \"\"" Dec 03 12:43:47 crc kubenswrapper[4711]: I1203 12:43:47.175384 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-rq4cg"] Dec 03 12:43:47 crc kubenswrapper[4711]: I1203 12:43:47.216113 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/horizon-operator-index-b29vt"] Dec 03 12:43:47 crc kubenswrapper[4711]: I1203 12:43:47.223054 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/horizon-operator-index-b29vt"] Dec 03 12:43:47 crc kubenswrapper[4711]: I1203 12:43:47.231679 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-pvgpj"] Dec 03 12:43:47 crc kubenswrapper[4711]: W1203 12:43:47.235685 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod174cadcb_4175_478d_a8d1_0614f02a4cc2.slice/crio-f8e47cf6339b92b638c3a7abb79f5ac18c8cb4d219248c9b606ebc605fccf452 WatchSource:0}: Error finding container f8e47cf6339b92b638c3a7abb79f5ac18c8cb4d219248c9b606ebc605fccf452: Status 404 returned error can't find the container with id f8e47cf6339b92b638c3a7abb79f5ac18c8cb4d219248c9b606ebc605fccf452 Dec 03 12:43:47 crc kubenswrapper[4711]: I1203 12:43:47.817927 4711 scope.go:117] "RemoveContainer" containerID="8aceb8eade31346b968b79b9eafdd0ba0740fbe4943a8f7cf724864aa6dd593f" Dec 03 12:43:47 crc kubenswrapper[4711]: E1203 12:43:47.818787 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:43:47 crc kubenswrapper[4711]: I1203 12:43:47.826375 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c81f51fc-d96d-41de-8092-16a361a1e1cc" path="/var/lib/kubelet/pods/c81f51fc-d96d-41de-8092-16a361a1e1cc/volumes" Dec 03 12:43:47 crc kubenswrapper[4711]: I1203 12:43:47.885653 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-pvgpj" event={"ID":"174cadcb-4175-478d-a8d1-0614f02a4cc2","Type":"ContainerStarted","Data":"f8e47cf6339b92b638c3a7abb79f5ac18c8cb4d219248c9b606ebc605fccf452"} Dec 03 12:43:47 crc kubenswrapper[4711]: I1203 12:43:47.890203 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-k5dbg" event={"ID":"6f3afe72-bd28-46b4-9956-be4f9727c405","Type":"ContainerStarted","Data":"0bd17b4a32a5e2f6b06fe36d4071ff0e6b533d3d83ae46f77836695173b59f8d"} Dec 03 12:43:47 crc kubenswrapper[4711]: I1203 12:43:47.892042 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-rq4cg" event={"ID":"d23e5068-0a2c-40ac-bc4a-fb8a69961f3f","Type":"ContainerStarted","Data":"f2f3a182f4e960234f1bbdc5d68e93ca3137f4ab3a5999f3da7478e8d53ca91f"} Dec 03 12:43:47 crc kubenswrapper[4711]: I1203 12:43:47.914585 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-db-sync-k5dbg" podStartSLOduration=3.022428515 podStartE2EDuration="9.914566711s" podCreationTimestamp="2025-12-03 12:43:38 +0000 UTC" firstStartedPulling="2025-12-03 12:43:39.899024901 +0000 UTC m=+1738.568276156" lastFinishedPulling="2025-12-03 12:43:46.791163107 +0000 UTC m=+1745.460414352" observedRunningTime="2025-12-03 12:43:47.912283198 +0000 UTC 
m=+1746.581534453" watchObservedRunningTime="2025-12-03 12:43:47.914566711 +0000 UTC m=+1746.583817966" Dec 03 12:43:48 crc kubenswrapper[4711]: I1203 12:43:48.898493 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-rq4cg" event={"ID":"d23e5068-0a2c-40ac-bc4a-fb8a69961f3f","Type":"ContainerStarted","Data":"e4fbaf1901cbe0ce9092ae4669369216faf9c78ad9cc84b274c4718793da9571"} Dec 03 12:43:48 crc kubenswrapper[4711]: I1203 12:43:48.900161 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-pvgpj" event={"ID":"174cadcb-4175-478d-a8d1-0614f02a4cc2","Type":"ContainerStarted","Data":"9b63334ead4deb93c725a0a7e2ce64e9326a5c8546b7873567f155dcce36672a"} Dec 03 12:43:48 crc kubenswrapper[4711]: I1203 12:43:48.901955 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-lw8pr" event={"ID":"410cf095-14f3-44ce-8308-1af8c60b99db","Type":"ContainerStarted","Data":"1c6640d0d25a3265c280809492547cea080b384cd0d99b502f39e35a29251cad"} Dec 03 12:43:48 crc kubenswrapper[4711]: I1203 12:43:48.902180 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/swift-operator-index-lw8pr" podUID="410cf095-14f3-44ce-8308-1af8c60b99db" containerName="registry-server" containerID="cri-o://1c6640d0d25a3265c280809492547cea080b384cd0d99b502f39e35a29251cad" gracePeriod=2 Dec 03 12:43:48 crc kubenswrapper[4711]: I1203 12:43:48.916890 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-index-rq4cg" podStartSLOduration=2.076166099 podStartE2EDuration="2.916868045s" podCreationTimestamp="2025-12-03 12:43:46 +0000 UTC" firstStartedPulling="2025-12-03 12:43:47.187669726 +0000 UTC m=+1745.856920981" lastFinishedPulling="2025-12-03 12:43:48.028371672 +0000 UTC m=+1746.697622927" observedRunningTime="2025-12-03 12:43:48.916010661 +0000 UTC m=+1747.585261926" 
watchObservedRunningTime="2025-12-03 12:43:48.916868045 +0000 UTC m=+1747.586119300" Dec 03 12:43:48 crc kubenswrapper[4711]: I1203 12:43:48.932315 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-index-lw8pr" podStartSLOduration=2.716903376 podStartE2EDuration="9.93229971s" podCreationTimestamp="2025-12-03 12:43:39 +0000 UTC" firstStartedPulling="2025-12-03 12:43:40.518007519 +0000 UTC m=+1739.187258774" lastFinishedPulling="2025-12-03 12:43:47.733403853 +0000 UTC m=+1746.402655108" observedRunningTime="2025-12-03 12:43:48.929728939 +0000 UTC m=+1747.598980204" watchObservedRunningTime="2025-12-03 12:43:48.93229971 +0000 UTC m=+1747.601550965" Dec 03 12:43:48 crc kubenswrapper[4711]: I1203 12:43:48.947050 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-index-pvgpj" podStartSLOduration=6.32375341 podStartE2EDuration="6.947028267s" podCreationTimestamp="2025-12-03 12:43:42 +0000 UTC" firstStartedPulling="2025-12-03 12:43:47.239823335 +0000 UTC m=+1745.909074590" lastFinishedPulling="2025-12-03 12:43:47.863098192 +0000 UTC m=+1746.532349447" observedRunningTime="2025-12-03 12:43:48.941732171 +0000 UTC m=+1747.610983436" watchObservedRunningTime="2025-12-03 12:43:48.947028267 +0000 UTC m=+1747.616279532" Dec 03 12:43:49 crc kubenswrapper[4711]: I1203 12:43:49.283977 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-index-lw8pr" Dec 03 12:43:49 crc kubenswrapper[4711]: I1203 12:43:49.442092 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mngrd\" (UniqueName: \"kubernetes.io/projected/410cf095-14f3-44ce-8308-1af8c60b99db-kube-api-access-mngrd\") pod \"410cf095-14f3-44ce-8308-1af8c60b99db\" (UID: \"410cf095-14f3-44ce-8308-1af8c60b99db\") " Dec 03 12:43:49 crc kubenswrapper[4711]: I1203 12:43:49.446887 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/410cf095-14f3-44ce-8308-1af8c60b99db-kube-api-access-mngrd" (OuterVolumeSpecName: "kube-api-access-mngrd") pod "410cf095-14f3-44ce-8308-1af8c60b99db" (UID: "410cf095-14f3-44ce-8308-1af8c60b99db"). InnerVolumeSpecName "kube-api-access-mngrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:43:49 crc kubenswrapper[4711]: I1203 12:43:49.544010 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mngrd\" (UniqueName: \"kubernetes.io/projected/410cf095-14f3-44ce-8308-1af8c60b99db-kube-api-access-mngrd\") on node \"crc\" DevicePath \"\"" Dec 03 12:43:49 crc kubenswrapper[4711]: I1203 12:43:49.911842 4711 generic.go:334] "Generic (PLEG): container finished" podID="410cf095-14f3-44ce-8308-1af8c60b99db" containerID="1c6640d0d25a3265c280809492547cea080b384cd0d99b502f39e35a29251cad" exitCode=0 Dec 03 12:43:49 crc kubenswrapper[4711]: I1203 12:43:49.911885 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-lw8pr" event={"ID":"410cf095-14f3-44ce-8308-1af8c60b99db","Type":"ContainerDied","Data":"1c6640d0d25a3265c280809492547cea080b384cd0d99b502f39e35a29251cad"} Dec 03 12:43:49 crc kubenswrapper[4711]: I1203 12:43:49.911935 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-index-lw8pr" Dec 03 12:43:49 crc kubenswrapper[4711]: I1203 12:43:49.911962 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-lw8pr" event={"ID":"410cf095-14f3-44ce-8308-1af8c60b99db","Type":"ContainerDied","Data":"7c4d25b79b021639a8a3148132b1339fc30ab12cef1502eeb13ea3c4954f39b7"} Dec 03 12:43:49 crc kubenswrapper[4711]: I1203 12:43:49.911990 4711 scope.go:117] "RemoveContainer" containerID="1c6640d0d25a3265c280809492547cea080b384cd0d99b502f39e35a29251cad" Dec 03 12:43:49 crc kubenswrapper[4711]: I1203 12:43:49.932846 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/swift-operator-index-lw8pr"] Dec 03 12:43:49 crc kubenswrapper[4711]: I1203 12:43:49.944197 4711 scope.go:117] "RemoveContainer" containerID="1c6640d0d25a3265c280809492547cea080b384cd0d99b502f39e35a29251cad" Dec 03 12:43:49 crc kubenswrapper[4711]: E1203 12:43:49.945032 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c6640d0d25a3265c280809492547cea080b384cd0d99b502f39e35a29251cad\": container with ID starting with 1c6640d0d25a3265c280809492547cea080b384cd0d99b502f39e35a29251cad not found: ID does not exist" containerID="1c6640d0d25a3265c280809492547cea080b384cd0d99b502f39e35a29251cad" Dec 03 12:43:49 crc kubenswrapper[4711]: I1203 12:43:49.945107 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c6640d0d25a3265c280809492547cea080b384cd0d99b502f39e35a29251cad"} err="failed to get container status \"1c6640d0d25a3265c280809492547cea080b384cd0d99b502f39e35a29251cad\": rpc error: code = NotFound desc = could not find container \"1c6640d0d25a3265c280809492547cea080b384cd0d99b502f39e35a29251cad\": container with ID starting with 1c6640d0d25a3265c280809492547cea080b384cd0d99b502f39e35a29251cad not found: ID does not exist" Dec 03 
12:43:49 crc kubenswrapper[4711]: I1203 12:43:49.951554 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/swift-operator-index-lw8pr"] Dec 03 12:43:50 crc kubenswrapper[4711]: I1203 12:43:50.921199 4711 generic.go:334] "Generic (PLEG): container finished" podID="6f3afe72-bd28-46b4-9956-be4f9727c405" containerID="0bd17b4a32a5e2f6b06fe36d4071ff0e6b533d3d83ae46f77836695173b59f8d" exitCode=0 Dec 03 12:43:50 crc kubenswrapper[4711]: I1203 12:43:50.921236 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-k5dbg" event={"ID":"6f3afe72-bd28-46b4-9956-be4f9727c405","Type":"ContainerDied","Data":"0bd17b4a32a5e2f6b06fe36d4071ff0e6b533d3d83ae46f77836695173b59f8d"} Dec 03 12:43:51 crc kubenswrapper[4711]: I1203 12:43:51.823956 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="410cf095-14f3-44ce-8308-1af8c60b99db" path="/var/lib/kubelet/pods/410cf095-14f3-44ce-8308-1af8c60b99db/volumes" Dec 03 12:43:52 crc kubenswrapper[4711]: I1203 12:43:52.206750 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-k5dbg" Dec 03 12:43:52 crc kubenswrapper[4711]: I1203 12:43:52.285243 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f3afe72-bd28-46b4-9956-be4f9727c405-config-data\") pod \"6f3afe72-bd28-46b4-9956-be4f9727c405\" (UID: \"6f3afe72-bd28-46b4-9956-be4f9727c405\") " Dec 03 12:43:52 crc kubenswrapper[4711]: I1203 12:43:52.285360 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z5f9\" (UniqueName: \"kubernetes.io/projected/6f3afe72-bd28-46b4-9956-be4f9727c405-kube-api-access-6z5f9\") pod \"6f3afe72-bd28-46b4-9956-be4f9727c405\" (UID: \"6f3afe72-bd28-46b4-9956-be4f9727c405\") " Dec 03 12:43:52 crc kubenswrapper[4711]: I1203 12:43:52.290734 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f3afe72-bd28-46b4-9956-be4f9727c405-kube-api-access-6z5f9" (OuterVolumeSpecName: "kube-api-access-6z5f9") pod "6f3afe72-bd28-46b4-9956-be4f9727c405" (UID: "6f3afe72-bd28-46b4-9956-be4f9727c405"). InnerVolumeSpecName "kube-api-access-6z5f9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:43:52 crc kubenswrapper[4711]: I1203 12:43:52.341492 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f3afe72-bd28-46b4-9956-be4f9727c405-config-data" (OuterVolumeSpecName: "config-data") pod "6f3afe72-bd28-46b4-9956-be4f9727c405" (UID: "6f3afe72-bd28-46b4-9956-be4f9727c405"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:43:52 crc kubenswrapper[4711]: I1203 12:43:52.387333 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f3afe72-bd28-46b4-9956-be4f9727c405-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:43:52 crc kubenswrapper[4711]: I1203 12:43:52.387369 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6z5f9\" (UniqueName: \"kubernetes.io/projected/6f3afe72-bd28-46b4-9956-be4f9727c405-kube-api-access-6z5f9\") on node \"crc\" DevicePath \"\"" Dec 03 12:43:52 crc kubenswrapper[4711]: I1203 12:43:52.945556 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-k5dbg" event={"ID":"6f3afe72-bd28-46b4-9956-be4f9727c405","Type":"ContainerDied","Data":"b163129c09a65f7c17af33492b5357aa59b5528fc927fd79938ab4d022c58dac"} Dec 03 12:43:52 crc kubenswrapper[4711]: I1203 12:43:52.945600 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b163129c09a65f7c17af33492b5357aa59b5528fc927fd79938ab4d022c58dac" Dec 03 12:43:52 crc kubenswrapper[4711]: I1203 12:43:52.945621 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-k5dbg" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.093016 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/horizon-operator-index-pvgpj" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.093057 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-index-pvgpj" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.119714 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-8246r"] Dec 03 12:43:53 crc kubenswrapper[4711]: E1203 12:43:53.121221 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f3afe72-bd28-46b4-9956-be4f9727c405" containerName="keystone-db-sync" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.121247 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f3afe72-bd28-46b4-9956-be4f9727c405" containerName="keystone-db-sync" Dec 03 12:43:53 crc kubenswrapper[4711]: E1203 12:43:53.121281 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c81f51fc-d96d-41de-8092-16a361a1e1cc" containerName="registry-server" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.121290 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="c81f51fc-d96d-41de-8092-16a361a1e1cc" containerName="registry-server" Dec 03 12:43:53 crc kubenswrapper[4711]: E1203 12:43:53.121316 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="410cf095-14f3-44ce-8308-1af8c60b99db" containerName="registry-server" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.121325 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="410cf095-14f3-44ce-8308-1af8c60b99db" containerName="registry-server" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.121461 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="410cf095-14f3-44ce-8308-1af8c60b99db" 
containerName="registry-server" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.121476 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f3afe72-bd28-46b4-9956-be4f9727c405" containerName="keystone-db-sync" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.121491 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="c81f51fc-d96d-41de-8092-16a361a1e1cc" containerName="registry-server" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.122027 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-8246r" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.125652 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.125813 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.125867 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-89xf2" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.126012 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"osp-secret" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.127254 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.138219 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-8246r"] Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.164779 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/horizon-operator-index-pvgpj" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.199757 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7624667-e238-4d33-940f-34d33d029ad3-scripts\") pod \"keystone-bootstrap-8246r\" (UID: \"a7624667-e238-4d33-940f-34d33d029ad3\") " pod="glance-kuttl-tests/keystone-bootstrap-8246r" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.199802 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x94f2\" (UniqueName: \"kubernetes.io/projected/a7624667-e238-4d33-940f-34d33d029ad3-kube-api-access-x94f2\") pod \"keystone-bootstrap-8246r\" (UID: \"a7624667-e238-4d33-940f-34d33d029ad3\") " pod="glance-kuttl-tests/keystone-bootstrap-8246r" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.200047 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7624667-e238-4d33-940f-34d33d029ad3-config-data\") pod \"keystone-bootstrap-8246r\" (UID: \"a7624667-e238-4d33-940f-34d33d029ad3\") " pod="glance-kuttl-tests/keystone-bootstrap-8246r" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.200084 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a7624667-e238-4d33-940f-34d33d029ad3-credential-keys\") pod \"keystone-bootstrap-8246r\" (UID: \"a7624667-e238-4d33-940f-34d33d029ad3\") " pod="glance-kuttl-tests/keystone-bootstrap-8246r" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.200121 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a7624667-e238-4d33-940f-34d33d029ad3-fernet-keys\") pod \"keystone-bootstrap-8246r\" (UID: \"a7624667-e238-4d33-940f-34d33d029ad3\") " pod="glance-kuttl-tests/keystone-bootstrap-8246r" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.301502 4711 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7624667-e238-4d33-940f-34d33d029ad3-scripts\") pod \"keystone-bootstrap-8246r\" (UID: \"a7624667-e238-4d33-940f-34d33d029ad3\") " pod="glance-kuttl-tests/keystone-bootstrap-8246r" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.302095 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x94f2\" (UniqueName: \"kubernetes.io/projected/a7624667-e238-4d33-940f-34d33d029ad3-kube-api-access-x94f2\") pod \"keystone-bootstrap-8246r\" (UID: \"a7624667-e238-4d33-940f-34d33d029ad3\") " pod="glance-kuttl-tests/keystone-bootstrap-8246r" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.302203 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7624667-e238-4d33-940f-34d33d029ad3-config-data\") pod \"keystone-bootstrap-8246r\" (UID: \"a7624667-e238-4d33-940f-34d33d029ad3\") " pod="glance-kuttl-tests/keystone-bootstrap-8246r" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.302234 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a7624667-e238-4d33-940f-34d33d029ad3-credential-keys\") pod \"keystone-bootstrap-8246r\" (UID: \"a7624667-e238-4d33-940f-34d33d029ad3\") " pod="glance-kuttl-tests/keystone-bootstrap-8246r" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.302265 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a7624667-e238-4d33-940f-34d33d029ad3-fernet-keys\") pod \"keystone-bootstrap-8246r\" (UID: \"a7624667-e238-4d33-940f-34d33d029ad3\") " pod="glance-kuttl-tests/keystone-bootstrap-8246r" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.306170 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/a7624667-e238-4d33-940f-34d33d029ad3-credential-keys\") pod \"keystone-bootstrap-8246r\" (UID: \"a7624667-e238-4d33-940f-34d33d029ad3\") " pod="glance-kuttl-tests/keystone-bootstrap-8246r" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.306446 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7624667-e238-4d33-940f-34d33d029ad3-scripts\") pod \"keystone-bootstrap-8246r\" (UID: \"a7624667-e238-4d33-940f-34d33d029ad3\") " pod="glance-kuttl-tests/keystone-bootstrap-8246r" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.306497 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a7624667-e238-4d33-940f-34d33d029ad3-fernet-keys\") pod \"keystone-bootstrap-8246r\" (UID: \"a7624667-e238-4d33-940f-34d33d029ad3\") " pod="glance-kuttl-tests/keystone-bootstrap-8246r" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.306727 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7624667-e238-4d33-940f-34d33d029ad3-config-data\") pod \"keystone-bootstrap-8246r\" (UID: \"a7624667-e238-4d33-940f-34d33d029ad3\") " pod="glance-kuttl-tests/keystone-bootstrap-8246r" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.322634 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x94f2\" (UniqueName: \"kubernetes.io/projected/a7624667-e238-4d33-940f-34d33d029ad3-kube-api-access-x94f2\") pod \"keystone-bootstrap-8246r\" (UID: \"a7624667-e238-4d33-940f-34d33d029ad3\") " pod="glance-kuttl-tests/keystone-bootstrap-8246r" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.445735 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-8246r" Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.854449 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-8246r"] Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.953411 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-8246r" event={"ID":"a7624667-e238-4d33-940f-34d33d029ad3","Type":"ContainerStarted","Data":"0ad22b9c4605e4c1ed0e33a962f79750887cfd36cc3e5d01e22d1cf95cfa7180"} Dec 03 12:43:53 crc kubenswrapper[4711]: I1203 12:43:53.985479 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-index-pvgpj" Dec 03 12:43:54 crc kubenswrapper[4711]: I1203 12:43:54.960899 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-8246r" event={"ID":"a7624667-e238-4d33-940f-34d33d029ad3","Type":"ContainerStarted","Data":"e89a7a0e7372b219b5ad194473c98fa7b6d96b5a2f0da2445e19c04fabe75d17"} Dec 03 12:43:54 crc kubenswrapper[4711]: I1203 12:43:54.980463 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-bootstrap-8246r" podStartSLOduration=1.98044609 podStartE2EDuration="1.98044609s" podCreationTimestamp="2025-12-03 12:43:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:43:54.97572816 +0000 UTC m=+1753.644979425" watchObservedRunningTime="2025-12-03 12:43:54.98044609 +0000 UTC m=+1753.649697345" Dec 03 12:43:56 crc kubenswrapper[4711]: I1203 12:43:56.699472 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/swift-operator-index-rq4cg" Dec 03 12:43:56 crc kubenswrapper[4711]: I1203 12:43:56.699524 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/swift-operator-index-rq4cg" Dec 03 12:43:56 crc kubenswrapper[4711]: I1203 12:43:56.726678 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/swift-operator-index-rq4cg" Dec 03 12:43:57 crc kubenswrapper[4711]: I1203 12:43:57.004437 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-index-rq4cg" Dec 03 12:43:57 crc kubenswrapper[4711]: I1203 12:43:57.984255 4711 generic.go:334] "Generic (PLEG): container finished" podID="a7624667-e238-4d33-940f-34d33d029ad3" containerID="e89a7a0e7372b219b5ad194473c98fa7b6d96b5a2f0da2445e19c04fabe75d17" exitCode=0 Dec 03 12:43:57 crc kubenswrapper[4711]: I1203 12:43:57.984378 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-8246r" event={"ID":"a7624667-e238-4d33-940f-34d33d029ad3","Type":"ContainerDied","Data":"e89a7a0e7372b219b5ad194473c98fa7b6d96b5a2f0da2445e19c04fabe75d17"} Dec 03 12:43:59 crc kubenswrapper[4711]: I1203 12:43:59.284796 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-8246r"
Dec 03 12:43:59 crc kubenswrapper[4711]: I1203 12:43:59.393462 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7624667-e238-4d33-940f-34d33d029ad3-scripts\") pod \"a7624667-e238-4d33-940f-34d33d029ad3\" (UID: \"a7624667-e238-4d33-940f-34d33d029ad3\") "
Dec 03 12:43:59 crc kubenswrapper[4711]: I1203 12:43:59.393552 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a7624667-e238-4d33-940f-34d33d029ad3-credential-keys\") pod \"a7624667-e238-4d33-940f-34d33d029ad3\" (UID: \"a7624667-e238-4d33-940f-34d33d029ad3\") "
Dec 03 12:43:59 crc kubenswrapper[4711]: I1203 12:43:59.393640 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7624667-e238-4d33-940f-34d33d029ad3-config-data\") pod \"a7624667-e238-4d33-940f-34d33d029ad3\" (UID: \"a7624667-e238-4d33-940f-34d33d029ad3\") "
Dec 03 12:43:59 crc kubenswrapper[4711]: I1203 12:43:59.393698 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a7624667-e238-4d33-940f-34d33d029ad3-fernet-keys\") pod \"a7624667-e238-4d33-940f-34d33d029ad3\" (UID: \"a7624667-e238-4d33-940f-34d33d029ad3\") "
Dec 03 12:43:59 crc kubenswrapper[4711]: I1203 12:43:59.393739 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x94f2\" (UniqueName: \"kubernetes.io/projected/a7624667-e238-4d33-940f-34d33d029ad3-kube-api-access-x94f2\") pod \"a7624667-e238-4d33-940f-34d33d029ad3\" (UID: \"a7624667-e238-4d33-940f-34d33d029ad3\") "
Dec 03 12:43:59 crc kubenswrapper[4711]: I1203 12:43:59.408980 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7624667-e238-4d33-940f-34d33d029ad3-kube-api-access-x94f2" (OuterVolumeSpecName: "kube-api-access-x94f2") pod "a7624667-e238-4d33-940f-34d33d029ad3" (UID: "a7624667-e238-4d33-940f-34d33d029ad3"). InnerVolumeSpecName "kube-api-access-x94f2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:43:59 crc kubenswrapper[4711]: I1203 12:43:59.409356 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7624667-e238-4d33-940f-34d33d029ad3-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a7624667-e238-4d33-940f-34d33d029ad3" (UID: "a7624667-e238-4d33-940f-34d33d029ad3"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:43:59 crc kubenswrapper[4711]: I1203 12:43:59.411588 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7624667-e238-4d33-940f-34d33d029ad3-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a7624667-e238-4d33-940f-34d33d029ad3" (UID: "a7624667-e238-4d33-940f-34d33d029ad3"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:43:59 crc kubenswrapper[4711]: I1203 12:43:59.412060 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7624667-e238-4d33-940f-34d33d029ad3-scripts" (OuterVolumeSpecName: "scripts") pod "a7624667-e238-4d33-940f-34d33d029ad3" (UID: "a7624667-e238-4d33-940f-34d33d029ad3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:43:59 crc kubenswrapper[4711]: I1203 12:43:59.430235 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7624667-e238-4d33-940f-34d33d029ad3-config-data" (OuterVolumeSpecName: "config-data") pod "a7624667-e238-4d33-940f-34d33d029ad3" (UID: "a7624667-e238-4d33-940f-34d33d029ad3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:43:59 crc kubenswrapper[4711]: I1203 12:43:59.495586 4711 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7624667-e238-4d33-940f-34d33d029ad3-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 12:43:59 crc kubenswrapper[4711]: I1203 12:43:59.495628 4711 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a7624667-e238-4d33-940f-34d33d029ad3-credential-keys\") on node \"crc\" DevicePath \"\""
Dec 03 12:43:59 crc kubenswrapper[4711]: I1203 12:43:59.495640 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7624667-e238-4d33-940f-34d33d029ad3-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 12:43:59 crc kubenswrapper[4711]: I1203 12:43:59.495649 4711 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a7624667-e238-4d33-940f-34d33d029ad3-fernet-keys\") on node \"crc\" DevicePath \"\""
Dec 03 12:43:59 crc kubenswrapper[4711]: I1203 12:43:59.495661 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x94f2\" (UniqueName: \"kubernetes.io/projected/a7624667-e238-4d33-940f-34d33d029ad3-kube-api-access-x94f2\") on node \"crc\" DevicePath \"\""
Dec 03 12:43:59 crc kubenswrapper[4711]: I1203 12:43:59.999516 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-8246r" event={"ID":"a7624667-e238-4d33-940f-34d33d029ad3","Type":"ContainerDied","Data":"0ad22b9c4605e4c1ed0e33a962f79750887cfd36cc3e5d01e22d1cf95cfa7180"}
Dec 03 12:43:59 crc kubenswrapper[4711]: I1203 12:43:59.999572 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ad22b9c4605e4c1ed0e33a962f79750887cfd36cc3e5d01e22d1cf95cfa7180"
Dec 03 12:43:59 crc kubenswrapper[4711]: I1203 12:43:59.999646 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-8246r"
Dec 03 12:44:00 crc kubenswrapper[4711]: I1203 12:44:00.195735 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-6884479545-5vgqj"]
Dec 03 12:44:00 crc kubenswrapper[4711]: E1203 12:44:00.195991 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7624667-e238-4d33-940f-34d33d029ad3" containerName="keystone-bootstrap"
Dec 03 12:44:00 crc kubenswrapper[4711]: I1203 12:44:00.196005 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7624667-e238-4d33-940f-34d33d029ad3" containerName="keystone-bootstrap"
Dec 03 12:44:00 crc kubenswrapper[4711]: I1203 12:44:00.196116 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7624667-e238-4d33-940f-34d33d029ad3" containerName="keystone-bootstrap"
Dec 03 12:44:00 crc kubenswrapper[4711]: I1203 12:44:00.196518 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-6884479545-5vgqj"
Dec 03 12:44:00 crc kubenswrapper[4711]: I1203 12:44:00.198977 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-89xf2"
Dec 03 12:44:00 crc kubenswrapper[4711]: I1203 12:44:00.199394 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone"
Dec 03 12:44:00 crc kubenswrapper[4711]: I1203 12:44:00.199401 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data"
Dec 03 12:44:00 crc kubenswrapper[4711]: I1203 12:44:00.209403 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts"
Dec 03 12:44:00 crc kubenswrapper[4711]: I1203 12:44:00.212048 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-6884479545-5vgqj"]
Dec 03 12:44:00 crc kubenswrapper[4711]: I1203 12:44:00.307008 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a5fda40f-b786-4b92-b502-f8457a37d2aa-fernet-keys\") pod \"keystone-6884479545-5vgqj\" (UID: \"a5fda40f-b786-4b92-b502-f8457a37d2aa\") " pod="glance-kuttl-tests/keystone-6884479545-5vgqj"
Dec 03 12:44:00 crc kubenswrapper[4711]: I1203 12:44:00.307055 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a5fda40f-b786-4b92-b502-f8457a37d2aa-credential-keys\") pod \"keystone-6884479545-5vgqj\" (UID: \"a5fda40f-b786-4b92-b502-f8457a37d2aa\") " pod="glance-kuttl-tests/keystone-6884479545-5vgqj"
Dec 03 12:44:00 crc kubenswrapper[4711]: I1203 12:44:00.307087 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5fda40f-b786-4b92-b502-f8457a37d2aa-scripts\") pod \"keystone-6884479545-5vgqj\" (UID: \"a5fda40f-b786-4b92-b502-f8457a37d2aa\") " pod="glance-kuttl-tests/keystone-6884479545-5vgqj"
Dec 03 12:44:00 crc kubenswrapper[4711]: I1203 12:44:00.307181 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmrtx\" (UniqueName: \"kubernetes.io/projected/a5fda40f-b786-4b92-b502-f8457a37d2aa-kube-api-access-fmrtx\") pod \"keystone-6884479545-5vgqj\" (UID: \"a5fda40f-b786-4b92-b502-f8457a37d2aa\") " pod="glance-kuttl-tests/keystone-6884479545-5vgqj"
Dec 03 12:44:00 crc kubenswrapper[4711]: I1203 12:44:00.307231 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5fda40f-b786-4b92-b502-f8457a37d2aa-config-data\") pod \"keystone-6884479545-5vgqj\" (UID: \"a5fda40f-b786-4b92-b502-f8457a37d2aa\") " pod="glance-kuttl-tests/keystone-6884479545-5vgqj"
Dec 03 12:44:00 crc kubenswrapper[4711]: I1203 12:44:00.408310 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a5fda40f-b786-4b92-b502-f8457a37d2aa-fernet-keys\") pod \"keystone-6884479545-5vgqj\" (UID: \"a5fda40f-b786-4b92-b502-f8457a37d2aa\") " pod="glance-kuttl-tests/keystone-6884479545-5vgqj"
Dec 03 12:44:00 crc kubenswrapper[4711]: I1203 12:44:00.408378 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a5fda40f-b786-4b92-b502-f8457a37d2aa-credential-keys\") pod \"keystone-6884479545-5vgqj\" (UID: \"a5fda40f-b786-4b92-b502-f8457a37d2aa\") " pod="glance-kuttl-tests/keystone-6884479545-5vgqj"
Dec 03 12:44:00 crc kubenswrapper[4711]: I1203 12:44:00.408431 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5fda40f-b786-4b92-b502-f8457a37d2aa-scripts\") pod \"keystone-6884479545-5vgqj\" (UID: \"a5fda40f-b786-4b92-b502-f8457a37d2aa\") " pod="glance-kuttl-tests/keystone-6884479545-5vgqj"
Dec 03 12:44:00 crc kubenswrapper[4711]: I1203 12:44:00.408495 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmrtx\" (UniqueName: \"kubernetes.io/projected/a5fda40f-b786-4b92-b502-f8457a37d2aa-kube-api-access-fmrtx\") pod \"keystone-6884479545-5vgqj\" (UID: \"a5fda40f-b786-4b92-b502-f8457a37d2aa\") " pod="glance-kuttl-tests/keystone-6884479545-5vgqj"
Dec 03 12:44:00 crc kubenswrapper[4711]: I1203 12:44:00.408567 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5fda40f-b786-4b92-b502-f8457a37d2aa-config-data\") pod \"keystone-6884479545-5vgqj\" (UID: \"a5fda40f-b786-4b92-b502-f8457a37d2aa\") " pod="glance-kuttl-tests/keystone-6884479545-5vgqj"
Dec 03 12:44:00 crc kubenswrapper[4711]: I1203 12:44:00.413268 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5fda40f-b786-4b92-b502-f8457a37d2aa-scripts\") pod \"keystone-6884479545-5vgqj\" (UID: \"a5fda40f-b786-4b92-b502-f8457a37d2aa\") " pod="glance-kuttl-tests/keystone-6884479545-5vgqj"
Dec 03 12:44:00 crc kubenswrapper[4711]: I1203 12:44:00.413744 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a5fda40f-b786-4b92-b502-f8457a37d2aa-fernet-keys\") pod \"keystone-6884479545-5vgqj\" (UID: \"a5fda40f-b786-4b92-b502-f8457a37d2aa\") " pod="glance-kuttl-tests/keystone-6884479545-5vgqj"
Dec 03 12:44:00 crc kubenswrapper[4711]: I1203 12:44:00.414201 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5fda40f-b786-4b92-b502-f8457a37d2aa-config-data\") pod \"keystone-6884479545-5vgqj\" (UID: \"a5fda40f-b786-4b92-b502-f8457a37d2aa\") " pod="glance-kuttl-tests/keystone-6884479545-5vgqj"
Dec 03 12:44:00 crc kubenswrapper[4711]: I1203 12:44:00.414846 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a5fda40f-b786-4b92-b502-f8457a37d2aa-credential-keys\") pod \"keystone-6884479545-5vgqj\" (UID: \"a5fda40f-b786-4b92-b502-f8457a37d2aa\") " pod="glance-kuttl-tests/keystone-6884479545-5vgqj"
Dec 03 12:44:00 crc kubenswrapper[4711]: I1203 12:44:00.429576 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmrtx\" (UniqueName: \"kubernetes.io/projected/a5fda40f-b786-4b92-b502-f8457a37d2aa-kube-api-access-fmrtx\") pod \"keystone-6884479545-5vgqj\" (UID: \"a5fda40f-b786-4b92-b502-f8457a37d2aa\") " pod="glance-kuttl-tests/keystone-6884479545-5vgqj"
Dec 03 12:44:00 crc kubenswrapper[4711]: I1203 12:44:00.521776 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-6884479545-5vgqj"
Dec 03 12:44:00 crc kubenswrapper[4711]: I1203 12:44:00.960584 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-6884479545-5vgqj"]
Dec 03 12:44:00 crc kubenswrapper[4711]: W1203 12:44:00.969231 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5fda40f_b786_4b92_b502_f8457a37d2aa.slice/crio-c745f2021ac4213907a61c5ed0ab33eb3c9a2c6f6c4776ceae6d275f02f3a17e WatchSource:0}: Error finding container c745f2021ac4213907a61c5ed0ab33eb3c9a2c6f6c4776ceae6d275f02f3a17e: Status 404 returned error can't find the container with id c745f2021ac4213907a61c5ed0ab33eb3c9a2c6f6c4776ceae6d275f02f3a17e
Dec 03 12:44:01 crc kubenswrapper[4711]: I1203 12:44:01.007651 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-6884479545-5vgqj" event={"ID":"a5fda40f-b786-4b92-b502-f8457a37d2aa","Type":"ContainerStarted","Data":"c745f2021ac4213907a61c5ed0ab33eb3c9a2c6f6c4776ceae6d275f02f3a17e"}
Dec 03 12:44:01 crc kubenswrapper[4711]: I1203 12:44:01.822412 4711 scope.go:117] "RemoveContainer" containerID="8aceb8eade31346b968b79b9eafdd0ba0740fbe4943a8f7cf724864aa6dd593f"
Dec 03 12:44:01 crc kubenswrapper[4711]: E1203 12:44:01.823081 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e"
Dec 03 12:44:02 crc kubenswrapper[4711]: I1203 12:44:02.018825 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-6884479545-5vgqj" event={"ID":"a5fda40f-b786-4b92-b502-f8457a37d2aa","Type":"ContainerStarted","Data":"11a870143ceb82d5332ae0c3f6e9613cc823042ce52ce1b5834ac4ca32df54f1"}
Dec 03 12:44:02 crc kubenswrapper[4711]: I1203 12:44:02.019218 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/keystone-6884479545-5vgqj"
Dec 03 12:44:02 crc kubenswrapper[4711]: I1203 12:44:02.047617 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-6884479545-5vgqj" podStartSLOduration=2.047589844 podStartE2EDuration="2.047589844s" podCreationTimestamp="2025-12-03 12:44:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:44:02.040317073 +0000 UTC m=+1760.709568368" watchObservedRunningTime="2025-12-03 12:44:02.047589844 +0000 UTC m=+1760.716841109"
Dec 03 12:44:12 crc kubenswrapper[4711]: I1203 12:44:12.817945 4711 scope.go:117] "RemoveContainer" containerID="8aceb8eade31346b968b79b9eafdd0ba0740fbe4943a8f7cf724864aa6dd593f"
Dec 03 12:44:12 crc kubenswrapper[4711]: E1203 12:44:12.818653 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e"
Dec 03 12:44:12 crc kubenswrapper[4711]: I1203 12:44:12.825734 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82"]
Dec 03 12:44:12 crc kubenswrapper[4711]: I1203 12:44:12.827431 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82"
Dec 03 12:44:12 crc kubenswrapper[4711]: I1203 12:44:12.830011 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-gw8hz"
Dec 03 12:44:12 crc kubenswrapper[4711]: I1203 12:44:12.831368 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82"]
Dec 03 12:44:12 crc kubenswrapper[4711]: I1203 12:44:12.918616 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3bcd3a17-8baf-4419-89f9-ae37d4f14176-util\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82\" (UID: \"3bcd3a17-8baf-4419-89f9-ae37d4f14176\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82"
Dec 03 12:44:12 crc kubenswrapper[4711]: I1203 12:44:12.918818 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2ptk\" (UniqueName: \"kubernetes.io/projected/3bcd3a17-8baf-4419-89f9-ae37d4f14176-kube-api-access-n2ptk\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82\" (UID: \"3bcd3a17-8baf-4419-89f9-ae37d4f14176\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82"
Dec 03 12:44:12 crc kubenswrapper[4711]: I1203 12:44:12.918882 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3bcd3a17-8baf-4419-89f9-ae37d4f14176-bundle\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82\" (UID: \"3bcd3a17-8baf-4419-89f9-ae37d4f14176\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82"
Dec 03 12:44:13 crc kubenswrapper[4711]: I1203 12:44:13.020795 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3bcd3a17-8baf-4419-89f9-ae37d4f14176-util\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82\" (UID: \"3bcd3a17-8baf-4419-89f9-ae37d4f14176\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82"
Dec 03 12:44:13 crc kubenswrapper[4711]: I1203 12:44:13.020971 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2ptk\" (UniqueName: \"kubernetes.io/projected/3bcd3a17-8baf-4419-89f9-ae37d4f14176-kube-api-access-n2ptk\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82\" (UID: \"3bcd3a17-8baf-4419-89f9-ae37d4f14176\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82"
Dec 03 12:44:13 crc kubenswrapper[4711]: I1203 12:44:13.021023 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3bcd3a17-8baf-4419-89f9-ae37d4f14176-bundle\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82\" (UID: \"3bcd3a17-8baf-4419-89f9-ae37d4f14176\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82"
Dec 03 12:44:13 crc kubenswrapper[4711]: I1203 12:44:13.021704 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3bcd3a17-8baf-4419-89f9-ae37d4f14176-bundle\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82\" (UID: \"3bcd3a17-8baf-4419-89f9-ae37d4f14176\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82"
Dec 03 12:44:13 crc kubenswrapper[4711]: I1203 12:44:13.021714 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3bcd3a17-8baf-4419-89f9-ae37d4f14176-util\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82\" (UID: \"3bcd3a17-8baf-4419-89f9-ae37d4f14176\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82"
Dec 03 12:44:13 crc kubenswrapper[4711]: I1203 12:44:13.046122 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2ptk\" (UniqueName: \"kubernetes.io/projected/3bcd3a17-8baf-4419-89f9-ae37d4f14176-kube-api-access-n2ptk\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82\" (UID: \"3bcd3a17-8baf-4419-89f9-ae37d4f14176\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82"
Dec 03 12:44:13 crc kubenswrapper[4711]: I1203 12:44:13.151972 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82"
Dec 03 12:44:13 crc kubenswrapper[4711]: I1203 12:44:13.539180 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82"]
Dec 03 12:44:13 crc kubenswrapper[4711]: I1203 12:44:13.920727 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq"]
Dec 03 12:44:13 crc kubenswrapper[4711]: I1203 12:44:13.922198 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq"
Dec 03 12:44:13 crc kubenswrapper[4711]: I1203 12:44:13.928037 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq"]
Dec 03 12:44:14 crc kubenswrapper[4711]: I1203 12:44:14.034822 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b5c69ae-9e6e-431f-adf9-6f6b40e6254b-bundle\") pod \"eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq\" (UID: \"1b5c69ae-9e6e-431f-adf9-6f6b40e6254b\") " pod="openstack-operators/eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq"
Dec 03 12:44:14 crc kubenswrapper[4711]: I1203 12:44:14.034898 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b5c69ae-9e6e-431f-adf9-6f6b40e6254b-util\") pod \"eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq\" (UID: \"1b5c69ae-9e6e-431f-adf9-6f6b40e6254b\") " pod="openstack-operators/eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq"
Dec 03 12:44:14 crc kubenswrapper[4711]: I1203 12:44:14.035078 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhpcd\" (UniqueName: \"kubernetes.io/projected/1b5c69ae-9e6e-431f-adf9-6f6b40e6254b-kube-api-access-qhpcd\") pod \"eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq\" (UID: \"1b5c69ae-9e6e-431f-adf9-6f6b40e6254b\") " pod="openstack-operators/eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq"
Dec 03 12:44:14 crc kubenswrapper[4711]: I1203 12:44:14.107771 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82" event={"ID":"3bcd3a17-8baf-4419-89f9-ae37d4f14176","Type":"ContainerStarted","Data":"4c576ba00ee02c750e100092a97aac560c2e3b48fc22022c640dca868f32b0bc"}
Dec 03 12:44:14 crc kubenswrapper[4711]: I1203 12:44:14.136663 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b5c69ae-9e6e-431f-adf9-6f6b40e6254b-util\") pod \"eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq\" (UID: \"1b5c69ae-9e6e-431f-adf9-6f6b40e6254b\") " pod="openstack-operators/eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq"
Dec 03 12:44:14 crc kubenswrapper[4711]: I1203 12:44:14.136788 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhpcd\" (UniqueName: \"kubernetes.io/projected/1b5c69ae-9e6e-431f-adf9-6f6b40e6254b-kube-api-access-qhpcd\") pod \"eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq\" (UID: \"1b5c69ae-9e6e-431f-adf9-6f6b40e6254b\") " pod="openstack-operators/eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq"
Dec 03 12:44:14 crc kubenswrapper[4711]: I1203 12:44:14.136938 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b5c69ae-9e6e-431f-adf9-6f6b40e6254b-bundle\") pod \"eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq\" (UID: \"1b5c69ae-9e6e-431f-adf9-6f6b40e6254b\") " pod="openstack-operators/eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq"
Dec 03 12:44:14 crc kubenswrapper[4711]: I1203 12:44:14.137704 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b5c69ae-9e6e-431f-adf9-6f6b40e6254b-bundle\") pod \"eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq\" (UID: \"1b5c69ae-9e6e-431f-adf9-6f6b40e6254b\") " pod="openstack-operators/eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq"
Dec 03 12:44:14 crc kubenswrapper[4711]: I1203 12:44:14.138127 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b5c69ae-9e6e-431f-adf9-6f6b40e6254b-util\") pod \"eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq\" (UID: \"1b5c69ae-9e6e-431f-adf9-6f6b40e6254b\") " pod="openstack-operators/eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq"
Dec 03 12:44:14 crc kubenswrapper[4711]: I1203 12:44:14.162658 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhpcd\" (UniqueName: \"kubernetes.io/projected/1b5c69ae-9e6e-431f-adf9-6f6b40e6254b-kube-api-access-qhpcd\") pod \"eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq\" (UID: \"1b5c69ae-9e6e-431f-adf9-6f6b40e6254b\") " pod="openstack-operators/eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq"
Dec 03 12:44:14 crc kubenswrapper[4711]: I1203 12:44:14.242813 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq"
Dec 03 12:44:14 crc kubenswrapper[4711]: I1203 12:44:14.681053 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq"]
Dec 03 12:44:15 crc kubenswrapper[4711]: I1203 12:44:15.123226 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82" event={"ID":"3bcd3a17-8baf-4419-89f9-ae37d4f14176","Type":"ContainerStarted","Data":"f89839d6b4801d0c1caccd5049c902637a78351dfa73fa5c52df1b5d7dd61857"}
Dec 03 12:44:15 crc kubenswrapper[4711]: I1203 12:44:15.125519 4711 generic.go:334] "Generic (PLEG): container finished" podID="1b5c69ae-9e6e-431f-adf9-6f6b40e6254b" containerID="3d2c933550aff0d088a0f8d473577afff19366c7ec8b3a166de33790f2262b8d" exitCode=0
Dec 03 12:44:15 crc kubenswrapper[4711]: I1203 12:44:15.125587 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq" event={"ID":"1b5c69ae-9e6e-431f-adf9-6f6b40e6254b","Type":"ContainerDied","Data":"3d2c933550aff0d088a0f8d473577afff19366c7ec8b3a166de33790f2262b8d"}
Dec 03 12:44:15 crc kubenswrapper[4711]: I1203 12:44:15.126030 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq" event={"ID":"1b5c69ae-9e6e-431f-adf9-6f6b40e6254b","Type":"ContainerStarted","Data":"8df694365598b2825fe76bc459103b79c950fa894050551f36f2bdcd296b273d"}
Dec 03 12:44:16 crc kubenswrapper[4711]: I1203 12:44:16.137146 4711 generic.go:334] "Generic (PLEG): container finished" podID="3bcd3a17-8baf-4419-89f9-ae37d4f14176" containerID="f89839d6b4801d0c1caccd5049c902637a78351dfa73fa5c52df1b5d7dd61857" exitCode=0
Dec 03 12:44:16 crc kubenswrapper[4711]: I1203 12:44:16.137199 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82" event={"ID":"3bcd3a17-8baf-4419-89f9-ae37d4f14176","Type":"ContainerDied","Data":"f89839d6b4801d0c1caccd5049c902637a78351dfa73fa5c52df1b5d7dd61857"}
Dec 03 12:44:17 crc kubenswrapper[4711]: I1203 12:44:17.145534 4711 generic.go:334] "Generic (PLEG): container finished" podID="1b5c69ae-9e6e-431f-adf9-6f6b40e6254b" containerID="d24766c61e4668ed16869f049f2c3b9de47c5f3a861bf13259e6fe39a48d0c77" exitCode=0
Dec 03 12:44:17 crc kubenswrapper[4711]: I1203 12:44:17.145724 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq" event={"ID":"1b5c69ae-9e6e-431f-adf9-6f6b40e6254b","Type":"ContainerDied","Data":"d24766c61e4668ed16869f049f2c3b9de47c5f3a861bf13259e6fe39a48d0c77"}
Dec 03 12:44:17 crc kubenswrapper[4711]: I1203 12:44:17.150189 4711 generic.go:334] "Generic (PLEG): container finished" podID="3bcd3a17-8baf-4419-89f9-ae37d4f14176" containerID="b8e74898df49770cf14b15120271d5ee4493080904a87be09c95bc6425247fde" exitCode=0
Dec 03 12:44:17 crc kubenswrapper[4711]: I1203 12:44:17.150246 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82" event={"ID":"3bcd3a17-8baf-4419-89f9-ae37d4f14176","Type":"ContainerDied","Data":"b8e74898df49770cf14b15120271d5ee4493080904a87be09c95bc6425247fde"}
Dec 03 12:44:18 crc kubenswrapper[4711]: I1203 12:44:18.160946 4711 generic.go:334] "Generic (PLEG): container finished" podID="1b5c69ae-9e6e-431f-adf9-6f6b40e6254b" containerID="5b60b74c3f5b8f3bc4765f8321fa40fc8cd06534e3db1537a28bb24742c9dbd5" exitCode=0
Dec 03 12:44:18 crc kubenswrapper[4711]: I1203 12:44:18.160984 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq" event={"ID":"1b5c69ae-9e6e-431f-adf9-6f6b40e6254b","Type":"ContainerDied","Data":"5b60b74c3f5b8f3bc4765f8321fa40fc8cd06534e3db1537a28bb24742c9dbd5"}
Dec 03 12:44:18 crc kubenswrapper[4711]: I1203 12:44:18.166094 4711 generic.go:334] "Generic (PLEG): container finished" podID="3bcd3a17-8baf-4419-89f9-ae37d4f14176" containerID="9f72bb28f00859ad7b6bdf5639272354baa27a4673c0a5fce4de29865897c2e5" exitCode=0
Dec 03 12:44:18 crc kubenswrapper[4711]: I1203 12:44:18.166154 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82" event={"ID":"3bcd3a17-8baf-4419-89f9-ae37d4f14176","Type":"ContainerDied","Data":"9f72bb28f00859ad7b6bdf5639272354baa27a4673c0a5fce4de29865897c2e5"}
Dec 03 12:44:19 crc kubenswrapper[4711]: I1203 12:44:19.497534 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq"
Dec 03 12:44:19 crc kubenswrapper[4711]: I1203 12:44:19.502742 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82"
Dec 03 12:44:19 crc kubenswrapper[4711]: I1203 12:44:19.553265 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b5c69ae-9e6e-431f-adf9-6f6b40e6254b-util\") pod \"1b5c69ae-9e6e-431f-adf9-6f6b40e6254b\" (UID: \"1b5c69ae-9e6e-431f-adf9-6f6b40e6254b\") "
Dec 03 12:44:19 crc kubenswrapper[4711]: I1203 12:44:19.553360 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhpcd\" (UniqueName: \"kubernetes.io/projected/1b5c69ae-9e6e-431f-adf9-6f6b40e6254b-kube-api-access-qhpcd\") pod \"1b5c69ae-9e6e-431f-adf9-6f6b40e6254b\" (UID: \"1b5c69ae-9e6e-431f-adf9-6f6b40e6254b\") "
Dec 03 12:44:19 crc kubenswrapper[4711]: I1203 12:44:19.553431 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3bcd3a17-8baf-4419-89f9-ae37d4f14176-bundle\") pod \"3bcd3a17-8baf-4419-89f9-ae37d4f14176\" (UID: \"3bcd3a17-8baf-4419-89f9-ae37d4f14176\") "
Dec 03 12:44:19 crc kubenswrapper[4711]: I1203 12:44:19.553463 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2ptk\" (UniqueName: \"kubernetes.io/projected/3bcd3a17-8baf-4419-89f9-ae37d4f14176-kube-api-access-n2ptk\") pod \"3bcd3a17-8baf-4419-89f9-ae37d4f14176\" (UID: \"3bcd3a17-8baf-4419-89f9-ae37d4f14176\") "
Dec 03 12:44:19 crc kubenswrapper[4711]: I1203 12:44:19.553515 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b5c69ae-9e6e-431f-adf9-6f6b40e6254b-bundle\") pod \"1b5c69ae-9e6e-431f-adf9-6f6b40e6254b\" (UID: \"1b5c69ae-9e6e-431f-adf9-6f6b40e6254b\") "
Dec 03 12:44:19 crc kubenswrapper[4711]: I1203 12:44:19.553600 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3bcd3a17-8baf-4419-89f9-ae37d4f14176-util\") pod \"3bcd3a17-8baf-4419-89f9-ae37d4f14176\" (UID: \"3bcd3a17-8baf-4419-89f9-ae37d4f14176\") "
Dec 03 12:44:19 crc kubenswrapper[4711]: I1203 12:44:19.556529 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bcd3a17-8baf-4419-89f9-ae37d4f14176-bundle" (OuterVolumeSpecName: "bundle") pod "3bcd3a17-8baf-4419-89f9-ae37d4f14176" (UID: "3bcd3a17-8baf-4419-89f9-ae37d4f14176"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:44:19 crc kubenswrapper[4711]: I1203 12:44:19.561289 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b5c69ae-9e6e-431f-adf9-6f6b40e6254b-bundle" (OuterVolumeSpecName: "bundle") pod "1b5c69ae-9e6e-431f-adf9-6f6b40e6254b" (UID: "1b5c69ae-9e6e-431f-adf9-6f6b40e6254b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:44:19 crc kubenswrapper[4711]: I1203 12:44:19.566782 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bcd3a17-8baf-4419-89f9-ae37d4f14176-kube-api-access-n2ptk" (OuterVolumeSpecName: "kube-api-access-n2ptk") pod "3bcd3a17-8baf-4419-89f9-ae37d4f14176" (UID: "3bcd3a17-8baf-4419-89f9-ae37d4f14176"). InnerVolumeSpecName "kube-api-access-n2ptk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:44:19 crc kubenswrapper[4711]: I1203 12:44:19.567320 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b5c69ae-9e6e-431f-adf9-6f6b40e6254b-kube-api-access-qhpcd" (OuterVolumeSpecName: "kube-api-access-qhpcd") pod "1b5c69ae-9e6e-431f-adf9-6f6b40e6254b" (UID: "1b5c69ae-9e6e-431f-adf9-6f6b40e6254b"). InnerVolumeSpecName "kube-api-access-qhpcd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:44:19 crc kubenswrapper[4711]: I1203 12:44:19.571323 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bcd3a17-8baf-4419-89f9-ae37d4f14176-util" (OuterVolumeSpecName: "util") pod "3bcd3a17-8baf-4419-89f9-ae37d4f14176" (UID: "3bcd3a17-8baf-4419-89f9-ae37d4f14176"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:44:19 crc kubenswrapper[4711]: I1203 12:44:19.572510 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b5c69ae-9e6e-431f-adf9-6f6b40e6254b-util" (OuterVolumeSpecName: "util") pod "1b5c69ae-9e6e-431f-adf9-6f6b40e6254b" (UID: "1b5c69ae-9e6e-431f-adf9-6f6b40e6254b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:44:19 crc kubenswrapper[4711]: I1203 12:44:19.655399 4711 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b5c69ae-9e6e-431f-adf9-6f6b40e6254b-util\") on node \"crc\" DevicePath \"\""
Dec 03 12:44:19 crc kubenswrapper[4711]: I1203 12:44:19.655686 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhpcd\" (UniqueName: \"kubernetes.io/projected/1b5c69ae-9e6e-431f-adf9-6f6b40e6254b-kube-api-access-qhpcd\") on node \"crc\" DevicePath \"\""
Dec 03 12:44:19 crc kubenswrapper[4711]: I1203 12:44:19.655699 4711 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3bcd3a17-8baf-4419-89f9-ae37d4f14176-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 12:44:19 crc kubenswrapper[4711]: I1203 12:44:19.655707 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2ptk\" (UniqueName: \"kubernetes.io/projected/3bcd3a17-8baf-4419-89f9-ae37d4f14176-kube-api-access-n2ptk\") on node \"crc\" DevicePath \"\""
Dec 03 12:44:19 crc kubenswrapper[4711]: I1203 12:44:19.655716 4711 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b5c69ae-9e6e-431f-adf9-6f6b40e6254b-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 12:44:19 crc kubenswrapper[4711]: I1203 12:44:19.655729 4711 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3bcd3a17-8baf-4419-89f9-ae37d4f14176-util\") on node \"crc\" DevicePath \"\""
Dec 03 12:44:20 crc kubenswrapper[4711]: I1203 12:44:20.187267 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82" event={"ID":"3bcd3a17-8baf-4419-89f9-ae37d4f14176","Type":"ContainerDied","Data":"4c576ba00ee02c750e100092a97aac560c2e3b48fc22022c640dca868f32b0bc"}
Dec 03 12:44:20 crc kubenswrapper[4711]: I1203 12:44:20.187332 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c576ba00ee02c750e100092a97aac560c2e3b48fc22022c640dca868f32b0bc"
Dec 03 12:44:20 crc kubenswrapper[4711]: I1203 12:44:20.187291 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82"
Dec 03 12:44:20 crc kubenswrapper[4711]: I1203 12:44:20.189981 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq" event={"ID":"1b5c69ae-9e6e-431f-adf9-6f6b40e6254b","Type":"ContainerDied","Data":"8df694365598b2825fe76bc459103b79c950fa894050551f36f2bdcd296b273d"}
Dec 03 12:44:20 crc kubenswrapper[4711]: I1203 12:44:20.190034 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8df694365598b2825fe76bc459103b79c950fa894050551f36f2bdcd296b273d"
Dec 03 12:44:20 crc kubenswrapper[4711]: I1203 12:44:20.190053 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq" Dec 03 12:44:24 crc kubenswrapper[4711]: I1203 12:44:24.817269 4711 scope.go:117] "RemoveContainer" containerID="8aceb8eade31346b968b79b9eafdd0ba0740fbe4943a8f7cf724864aa6dd593f" Dec 03 12:44:24 crc kubenswrapper[4711]: E1203 12:44:24.817990 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:44:32 crc kubenswrapper[4711]: I1203 12:44:32.057536 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/keystone-6884479545-5vgqj" Dec 03 12:44:35 crc kubenswrapper[4711]: I1203 12:44:35.723170 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-7fbd454467-l5gnh"] Dec 03 12:44:35 crc kubenswrapper[4711]: E1203 12:44:35.723769 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bcd3a17-8baf-4419-89f9-ae37d4f14176" containerName="util" Dec 03 12:44:35 crc kubenswrapper[4711]: I1203 12:44:35.723785 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bcd3a17-8baf-4419-89f9-ae37d4f14176" containerName="util" Dec 03 12:44:35 crc kubenswrapper[4711]: E1203 12:44:35.723799 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b5c69ae-9e6e-431f-adf9-6f6b40e6254b" containerName="util" Dec 03 12:44:35 crc kubenswrapper[4711]: I1203 12:44:35.723807 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b5c69ae-9e6e-431f-adf9-6f6b40e6254b" containerName="util" Dec 03 12:44:35 crc kubenswrapper[4711]: E1203 12:44:35.723823 4711 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bcd3a17-8baf-4419-89f9-ae37d4f14176" containerName="pull" Dec 03 12:44:35 crc kubenswrapper[4711]: I1203 12:44:35.723832 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bcd3a17-8baf-4419-89f9-ae37d4f14176" containerName="pull" Dec 03 12:44:35 crc kubenswrapper[4711]: E1203 12:44:35.723841 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b5c69ae-9e6e-431f-adf9-6f6b40e6254b" containerName="pull" Dec 03 12:44:35 crc kubenswrapper[4711]: I1203 12:44:35.723849 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b5c69ae-9e6e-431f-adf9-6f6b40e6254b" containerName="pull" Dec 03 12:44:35 crc kubenswrapper[4711]: E1203 12:44:35.723861 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b5c69ae-9e6e-431f-adf9-6f6b40e6254b" containerName="extract" Dec 03 12:44:35 crc kubenswrapper[4711]: I1203 12:44:35.723868 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b5c69ae-9e6e-431f-adf9-6f6b40e6254b" containerName="extract" Dec 03 12:44:35 crc kubenswrapper[4711]: E1203 12:44:35.723884 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bcd3a17-8baf-4419-89f9-ae37d4f14176" containerName="extract" Dec 03 12:44:35 crc kubenswrapper[4711]: I1203 12:44:35.723891 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bcd3a17-8baf-4419-89f9-ae37d4f14176" containerName="extract" Dec 03 12:44:35 crc kubenswrapper[4711]: I1203 12:44:35.724060 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b5c69ae-9e6e-431f-adf9-6f6b40e6254b" containerName="extract" Dec 03 12:44:35 crc kubenswrapper[4711]: I1203 12:44:35.724077 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bcd3a17-8baf-4419-89f9-ae37d4f14176" containerName="extract" Dec 03 12:44:35 crc kubenswrapper[4711]: I1203 12:44:35.724598 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-7fbd454467-l5gnh" Dec 03 12:44:35 crc kubenswrapper[4711]: I1203 12:44:35.726261 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-service-cert" Dec 03 12:44:35 crc kubenswrapper[4711]: I1203 12:44:35.726357 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-w6nw5" Dec 03 12:44:35 crc kubenswrapper[4711]: I1203 12:44:35.740056 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-7fbd454467-l5gnh"] Dec 03 12:44:35 crc kubenswrapper[4711]: I1203 12:44:35.888238 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6a18ab91-ac21-48e9-804b-afe295a73e9c-apiservice-cert\") pod \"swift-operator-controller-manager-7fbd454467-l5gnh\" (UID: \"6a18ab91-ac21-48e9-804b-afe295a73e9c\") " pod="openstack-operators/swift-operator-controller-manager-7fbd454467-l5gnh" Dec 03 12:44:35 crc kubenswrapper[4711]: I1203 12:44:35.888689 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6a18ab91-ac21-48e9-804b-afe295a73e9c-webhook-cert\") pod \"swift-operator-controller-manager-7fbd454467-l5gnh\" (UID: \"6a18ab91-ac21-48e9-804b-afe295a73e9c\") " pod="openstack-operators/swift-operator-controller-manager-7fbd454467-l5gnh" Dec 03 12:44:35 crc kubenswrapper[4711]: I1203 12:44:35.888813 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pxf8\" (UniqueName: \"kubernetes.io/projected/6a18ab91-ac21-48e9-804b-afe295a73e9c-kube-api-access-9pxf8\") pod \"swift-operator-controller-manager-7fbd454467-l5gnh\" (UID: 
\"6a18ab91-ac21-48e9-804b-afe295a73e9c\") " pod="openstack-operators/swift-operator-controller-manager-7fbd454467-l5gnh" Dec 03 12:44:35 crc kubenswrapper[4711]: I1203 12:44:35.990202 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6a18ab91-ac21-48e9-804b-afe295a73e9c-apiservice-cert\") pod \"swift-operator-controller-manager-7fbd454467-l5gnh\" (UID: \"6a18ab91-ac21-48e9-804b-afe295a73e9c\") " pod="openstack-operators/swift-operator-controller-manager-7fbd454467-l5gnh" Dec 03 12:44:35 crc kubenswrapper[4711]: I1203 12:44:35.990567 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6a18ab91-ac21-48e9-804b-afe295a73e9c-webhook-cert\") pod \"swift-operator-controller-manager-7fbd454467-l5gnh\" (UID: \"6a18ab91-ac21-48e9-804b-afe295a73e9c\") " pod="openstack-operators/swift-operator-controller-manager-7fbd454467-l5gnh" Dec 03 12:44:35 crc kubenswrapper[4711]: I1203 12:44:35.990683 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pxf8\" (UniqueName: \"kubernetes.io/projected/6a18ab91-ac21-48e9-804b-afe295a73e9c-kube-api-access-9pxf8\") pod \"swift-operator-controller-manager-7fbd454467-l5gnh\" (UID: \"6a18ab91-ac21-48e9-804b-afe295a73e9c\") " pod="openstack-operators/swift-operator-controller-manager-7fbd454467-l5gnh" Dec 03 12:44:35 crc kubenswrapper[4711]: I1203 12:44:35.997569 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6a18ab91-ac21-48e9-804b-afe295a73e9c-apiservice-cert\") pod \"swift-operator-controller-manager-7fbd454467-l5gnh\" (UID: \"6a18ab91-ac21-48e9-804b-afe295a73e9c\") " pod="openstack-operators/swift-operator-controller-manager-7fbd454467-l5gnh" Dec 03 12:44:35 crc kubenswrapper[4711]: I1203 12:44:35.997695 4711 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6a18ab91-ac21-48e9-804b-afe295a73e9c-webhook-cert\") pod \"swift-operator-controller-manager-7fbd454467-l5gnh\" (UID: \"6a18ab91-ac21-48e9-804b-afe295a73e9c\") " pod="openstack-operators/swift-operator-controller-manager-7fbd454467-l5gnh" Dec 03 12:44:36 crc kubenswrapper[4711]: I1203 12:44:36.012715 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pxf8\" (UniqueName: \"kubernetes.io/projected/6a18ab91-ac21-48e9-804b-afe295a73e9c-kube-api-access-9pxf8\") pod \"swift-operator-controller-manager-7fbd454467-l5gnh\" (UID: \"6a18ab91-ac21-48e9-804b-afe295a73e9c\") " pod="openstack-operators/swift-operator-controller-manager-7fbd454467-l5gnh" Dec 03 12:44:36 crc kubenswrapper[4711]: I1203 12:44:36.047381 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-7fbd454467-l5gnh" Dec 03 12:44:36 crc kubenswrapper[4711]: I1203 12:44:36.635658 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-7fbd454467-l5gnh"] Dec 03 12:44:37 crc kubenswrapper[4711]: I1203 12:44:37.314556 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-7fbd454467-l5gnh" event={"ID":"6a18ab91-ac21-48e9-804b-afe295a73e9c","Type":"ContainerStarted","Data":"9b8e96d3e2155e34d2e629211e75663c5ed9c9b996fba2e26fbc01c9a168e980"} Dec 03 12:44:37 crc kubenswrapper[4711]: I1203 12:44:37.817365 4711 scope.go:117] "RemoveContainer" containerID="8aceb8eade31346b968b79b9eafdd0ba0740fbe4943a8f7cf724864aa6dd593f" Dec 03 12:44:37 crc kubenswrapper[4711]: E1203 12:44:37.817722 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:44:44 crc kubenswrapper[4711]: I1203 12:44:44.372735 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-7fbd454467-l5gnh" event={"ID":"6a18ab91-ac21-48e9-804b-afe295a73e9c","Type":"ContainerStarted","Data":"5c09533c2addd4e9d8f08f1c4f5c927d6b98dd64cdbf371c62ef37d6de4ef325"} Dec 03 12:44:44 crc kubenswrapper[4711]: I1203 12:44:44.373345 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-7fbd454467-l5gnh" Dec 03 12:44:44 crc kubenswrapper[4711]: I1203 12:44:44.399734 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-7fbd454467-l5gnh" podStartSLOduration=2.640680287 podStartE2EDuration="9.399714277s" podCreationTimestamp="2025-12-03 12:44:35 +0000 UTC" firstStartedPulling="2025-12-03 12:44:36.641390436 +0000 UTC m=+1795.310641681" lastFinishedPulling="2025-12-03 12:44:43.400424416 +0000 UTC m=+1802.069675671" observedRunningTime="2025-12-03 12:44:44.397681241 +0000 UTC m=+1803.066932506" watchObservedRunningTime="2025-12-03 12:44:44.399714277 +0000 UTC m=+1803.068965542" Dec 03 12:44:44 crc kubenswrapper[4711]: I1203 12:44:44.767680 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7b5746d556-rrj48"] Dec 03 12:44:44 crc kubenswrapper[4711]: I1203 12:44:44.769049 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7b5746d556-rrj48" Dec 03 12:44:44 crc kubenswrapper[4711]: W1203 12:44:44.771154 4711 reflector.go:561] object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-9rqjg": failed to list *v1.Secret: secrets "horizon-operator-controller-manager-dockercfg-9rqjg" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack-operators": no relationship found between node 'crc' and this object Dec 03 12:44:44 crc kubenswrapper[4711]: E1203 12:44:44.771193 4711 reflector.go:158] "Unhandled Error" err="object-\"openstack-operators\"/\"horizon-operator-controller-manager-dockercfg-9rqjg\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"horizon-operator-controller-manager-dockercfg-9rqjg\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 12:44:44 crc kubenswrapper[4711]: W1203 12:44:44.771516 4711 reflector.go:561] object-"openstack-operators"/"horizon-operator-controller-manager-service-cert": failed to list *v1.Secret: secrets "horizon-operator-controller-manager-service-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack-operators": no relationship found between node 'crc' and this object Dec 03 12:44:44 crc kubenswrapper[4711]: E1203 12:44:44.771582 4711 reflector.go:158] "Unhandled Error" err="object-\"openstack-operators\"/\"horizon-operator-controller-manager-service-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"horizon-operator-controller-manager-service-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack-operators\": no relationship found between node 'crc' and this 
object" logger="UnhandledError" Dec 03 12:44:44 crc kubenswrapper[4711]: I1203 12:44:44.777167 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7b5746d556-rrj48"] Dec 03 12:44:44 crc kubenswrapper[4711]: I1203 12:44:44.923304 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml8n9\" (UniqueName: \"kubernetes.io/projected/7a329276-2ff5-41a6-90a7-5237ce641101-kube-api-access-ml8n9\") pod \"horizon-operator-controller-manager-7b5746d556-rrj48\" (UID: \"7a329276-2ff5-41a6-90a7-5237ce641101\") " pod="openstack-operators/horizon-operator-controller-manager-7b5746d556-rrj48" Dec 03 12:44:44 crc kubenswrapper[4711]: I1203 12:44:44.923382 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7a329276-2ff5-41a6-90a7-5237ce641101-webhook-cert\") pod \"horizon-operator-controller-manager-7b5746d556-rrj48\" (UID: \"7a329276-2ff5-41a6-90a7-5237ce641101\") " pod="openstack-operators/horizon-operator-controller-manager-7b5746d556-rrj48" Dec 03 12:44:44 crc kubenswrapper[4711]: I1203 12:44:44.923410 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7a329276-2ff5-41a6-90a7-5237ce641101-apiservice-cert\") pod \"horizon-operator-controller-manager-7b5746d556-rrj48\" (UID: \"7a329276-2ff5-41a6-90a7-5237ce641101\") " pod="openstack-operators/horizon-operator-controller-manager-7b5746d556-rrj48" Dec 03 12:44:45 crc kubenswrapper[4711]: I1203 12:44:45.024402 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7a329276-2ff5-41a6-90a7-5237ce641101-apiservice-cert\") pod \"horizon-operator-controller-manager-7b5746d556-rrj48\" (UID: \"7a329276-2ff5-41a6-90a7-5237ce641101\") 
" pod="openstack-operators/horizon-operator-controller-manager-7b5746d556-rrj48" Dec 03 12:44:45 crc kubenswrapper[4711]: I1203 12:44:45.024541 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml8n9\" (UniqueName: \"kubernetes.io/projected/7a329276-2ff5-41a6-90a7-5237ce641101-kube-api-access-ml8n9\") pod \"horizon-operator-controller-manager-7b5746d556-rrj48\" (UID: \"7a329276-2ff5-41a6-90a7-5237ce641101\") " pod="openstack-operators/horizon-operator-controller-manager-7b5746d556-rrj48" Dec 03 12:44:45 crc kubenswrapper[4711]: I1203 12:44:45.024581 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7a329276-2ff5-41a6-90a7-5237ce641101-webhook-cert\") pod \"horizon-operator-controller-manager-7b5746d556-rrj48\" (UID: \"7a329276-2ff5-41a6-90a7-5237ce641101\") " pod="openstack-operators/horizon-operator-controller-manager-7b5746d556-rrj48" Dec 03 12:44:45 crc kubenswrapper[4711]: I1203 12:44:45.042349 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml8n9\" (UniqueName: \"kubernetes.io/projected/7a329276-2ff5-41a6-90a7-5237ce641101-kube-api-access-ml8n9\") pod \"horizon-operator-controller-manager-7b5746d556-rrj48\" (UID: \"7a329276-2ff5-41a6-90a7-5237ce641101\") " pod="openstack-operators/horizon-operator-controller-manager-7b5746d556-rrj48" Dec 03 12:44:45 crc kubenswrapper[4711]: I1203 12:44:45.867092 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-9rqjg" Dec 03 12:44:46 crc kubenswrapper[4711]: E1203 12:44:46.025515 4711 secret.go:188] Couldn't get secret openstack-operators/horizon-operator-controller-manager-service-cert: failed to sync secret cache: timed out waiting for the condition Dec 03 12:44:46 crc kubenswrapper[4711]: E1203 12:44:46.025877 4711 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/7a329276-2ff5-41a6-90a7-5237ce641101-webhook-cert podName:7a329276-2ff5-41a6-90a7-5237ce641101 nodeName:}" failed. No retries permitted until 2025-12-03 12:44:46.525857683 +0000 UTC m=+1805.195108938 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/7a329276-2ff5-41a6-90a7-5237ce641101-webhook-cert") pod "horizon-operator-controller-manager-7b5746d556-rrj48" (UID: "7a329276-2ff5-41a6-90a7-5237ce641101") : failed to sync secret cache: timed out waiting for the condition Dec 03 12:44:46 crc kubenswrapper[4711]: E1203 12:44:46.025796 4711 secret.go:188] Couldn't get secret openstack-operators/horizon-operator-controller-manager-service-cert: failed to sync secret cache: timed out waiting for the condition Dec 03 12:44:46 crc kubenswrapper[4711]: E1203 12:44:46.026462 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a329276-2ff5-41a6-90a7-5237ce641101-apiservice-cert podName:7a329276-2ff5-41a6-90a7-5237ce641101 nodeName:}" failed. No retries permitted until 2025-12-03 12:44:46.526449889 +0000 UTC m=+1805.195701144 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/7a329276-2ff5-41a6-90a7-5237ce641101-apiservice-cert") pod "horizon-operator-controller-manager-7b5746d556-rrj48" (UID: "7a329276-2ff5-41a6-90a7-5237ce641101") : failed to sync secret cache: timed out waiting for the condition Dec 03 12:44:46 crc kubenswrapper[4711]: I1203 12:44:46.115679 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-service-cert" Dec 03 12:44:46 crc kubenswrapper[4711]: I1203 12:44:46.560858 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7a329276-2ff5-41a6-90a7-5237ce641101-webhook-cert\") pod \"horizon-operator-controller-manager-7b5746d556-rrj48\" (UID: \"7a329276-2ff5-41a6-90a7-5237ce641101\") " pod="openstack-operators/horizon-operator-controller-manager-7b5746d556-rrj48" Dec 03 12:44:46 crc kubenswrapper[4711]: I1203 12:44:46.560938 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7a329276-2ff5-41a6-90a7-5237ce641101-apiservice-cert\") pod \"horizon-operator-controller-manager-7b5746d556-rrj48\" (UID: \"7a329276-2ff5-41a6-90a7-5237ce641101\") " pod="openstack-operators/horizon-operator-controller-manager-7b5746d556-rrj48" Dec 03 12:44:46 crc kubenswrapper[4711]: I1203 12:44:46.565601 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7a329276-2ff5-41a6-90a7-5237ce641101-webhook-cert\") pod \"horizon-operator-controller-manager-7b5746d556-rrj48\" (UID: \"7a329276-2ff5-41a6-90a7-5237ce641101\") " pod="openstack-operators/horizon-operator-controller-manager-7b5746d556-rrj48" Dec 03 12:44:46 crc kubenswrapper[4711]: I1203 12:44:46.565802 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" 
(UniqueName: \"kubernetes.io/secret/7a329276-2ff5-41a6-90a7-5237ce641101-apiservice-cert\") pod \"horizon-operator-controller-manager-7b5746d556-rrj48\" (UID: \"7a329276-2ff5-41a6-90a7-5237ce641101\") " pod="openstack-operators/horizon-operator-controller-manager-7b5746d556-rrj48" Dec 03 12:44:46 crc kubenswrapper[4711]: I1203 12:44:46.587025 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7b5746d556-rrj48" Dec 03 12:44:46 crc kubenswrapper[4711]: I1203 12:44:46.993937 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7b5746d556-rrj48"] Dec 03 12:44:47 crc kubenswrapper[4711]: W1203 12:44:47.003548 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a329276_2ff5_41a6_90a7_5237ce641101.slice/crio-137f62b681e632678fc7b026c18f8cb2b357e58f34ce03ae0d8d1ec99f22821e WatchSource:0}: Error finding container 137f62b681e632678fc7b026c18f8cb2b357e58f34ce03ae0d8d1ec99f22821e: Status 404 returned error can't find the container with id 137f62b681e632678fc7b026c18f8cb2b357e58f34ce03ae0d8d1ec99f22821e Dec 03 12:44:47 crc kubenswrapper[4711]: I1203 12:44:47.391764 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7b5746d556-rrj48" event={"ID":"7a329276-2ff5-41a6-90a7-5237ce641101","Type":"ContainerStarted","Data":"137f62b681e632678fc7b026c18f8cb2b357e58f34ce03ae0d8d1ec99f22821e"} Dec 03 12:44:49 crc kubenswrapper[4711]: I1203 12:44:49.427308 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7b5746d556-rrj48" event={"ID":"7a329276-2ff5-41a6-90a7-5237ce641101","Type":"ContainerStarted","Data":"67c6d28714c7f9c2d42ef20e194495f40e54b2ea59b3895d39066439849d7b2f"} Dec 03 12:44:49 crc kubenswrapper[4711]: I1203 12:44:49.427444 
4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-7b5746d556-rrj48" Dec 03 12:44:49 crc kubenswrapper[4711]: I1203 12:44:49.450181 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-7b5746d556-rrj48" podStartSLOduration=3.420217053 podStartE2EDuration="5.45015798s" podCreationTimestamp="2025-12-03 12:44:44 +0000 UTC" firstStartedPulling="2025-12-03 12:44:47.005625375 +0000 UTC m=+1805.674876640" lastFinishedPulling="2025-12-03 12:44:49.035566312 +0000 UTC m=+1807.704817567" observedRunningTime="2025-12-03 12:44:49.443928519 +0000 UTC m=+1808.113179794" watchObservedRunningTime="2025-12-03 12:44:49.45015798 +0000 UTC m=+1808.119409235" Dec 03 12:44:52 crc kubenswrapper[4711]: I1203 12:44:52.819990 4711 scope.go:117] "RemoveContainer" containerID="8aceb8eade31346b968b79b9eafdd0ba0740fbe4943a8f7cf724864aa6dd593f" Dec 03 12:44:52 crc kubenswrapper[4711]: E1203 12:44:52.820739 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:44:56 crc kubenswrapper[4711]: I1203 12:44:56.052872 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-7fbd454467-l5gnh" Dec 03 12:44:56 crc kubenswrapper[4711]: I1203 12:44:56.591337 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-7b5746d556-rrj48" Dec 03 12:44:57 crc kubenswrapper[4711]: I1203 12:44:57.981288 4711 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m2tgt"] Dec 03 12:44:57 crc kubenswrapper[4711]: I1203 12:44:57.983136 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m2tgt" Dec 03 12:44:57 crc kubenswrapper[4711]: I1203 12:44:57.988749 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m2tgt"] Dec 03 12:44:58 crc kubenswrapper[4711]: I1203 12:44:58.029574 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/800e0d85-0d20-4da5-83a4-e41af85ef6d3-utilities\") pod \"redhat-operators-m2tgt\" (UID: \"800e0d85-0d20-4da5-83a4-e41af85ef6d3\") " pod="openshift-marketplace/redhat-operators-m2tgt" Dec 03 12:44:58 crc kubenswrapper[4711]: I1203 12:44:58.029718 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvblh\" (UniqueName: \"kubernetes.io/projected/800e0d85-0d20-4da5-83a4-e41af85ef6d3-kube-api-access-bvblh\") pod \"redhat-operators-m2tgt\" (UID: \"800e0d85-0d20-4da5-83a4-e41af85ef6d3\") " pod="openshift-marketplace/redhat-operators-m2tgt" Dec 03 12:44:58 crc kubenswrapper[4711]: I1203 12:44:58.029837 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/800e0d85-0d20-4da5-83a4-e41af85ef6d3-catalog-content\") pod \"redhat-operators-m2tgt\" (UID: \"800e0d85-0d20-4da5-83a4-e41af85ef6d3\") " pod="openshift-marketplace/redhat-operators-m2tgt" Dec 03 12:44:58 crc kubenswrapper[4711]: I1203 12:44:58.131175 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/800e0d85-0d20-4da5-83a4-e41af85ef6d3-utilities\") pod \"redhat-operators-m2tgt\" (UID: \"800e0d85-0d20-4da5-83a4-e41af85ef6d3\") " 
pod="openshift-marketplace/redhat-operators-m2tgt" Dec 03 12:44:58 crc kubenswrapper[4711]: I1203 12:44:58.131264 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvblh\" (UniqueName: \"kubernetes.io/projected/800e0d85-0d20-4da5-83a4-e41af85ef6d3-kube-api-access-bvblh\") pod \"redhat-operators-m2tgt\" (UID: \"800e0d85-0d20-4da5-83a4-e41af85ef6d3\") " pod="openshift-marketplace/redhat-operators-m2tgt" Dec 03 12:44:58 crc kubenswrapper[4711]: I1203 12:44:58.131309 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/800e0d85-0d20-4da5-83a4-e41af85ef6d3-catalog-content\") pod \"redhat-operators-m2tgt\" (UID: \"800e0d85-0d20-4da5-83a4-e41af85ef6d3\") " pod="openshift-marketplace/redhat-operators-m2tgt" Dec 03 12:44:58 crc kubenswrapper[4711]: I1203 12:44:58.131650 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/800e0d85-0d20-4da5-83a4-e41af85ef6d3-utilities\") pod \"redhat-operators-m2tgt\" (UID: \"800e0d85-0d20-4da5-83a4-e41af85ef6d3\") " pod="openshift-marketplace/redhat-operators-m2tgt" Dec 03 12:44:58 crc kubenswrapper[4711]: I1203 12:44:58.131830 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/800e0d85-0d20-4da5-83a4-e41af85ef6d3-catalog-content\") pod \"redhat-operators-m2tgt\" (UID: \"800e0d85-0d20-4da5-83a4-e41af85ef6d3\") " pod="openshift-marketplace/redhat-operators-m2tgt" Dec 03 12:44:58 crc kubenswrapper[4711]: I1203 12:44:58.165980 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvblh\" (UniqueName: \"kubernetes.io/projected/800e0d85-0d20-4da5-83a4-e41af85ef6d3-kube-api-access-bvblh\") pod \"redhat-operators-m2tgt\" (UID: \"800e0d85-0d20-4da5-83a4-e41af85ef6d3\") " pod="openshift-marketplace/redhat-operators-m2tgt" Dec 
03 12:44:58 crc kubenswrapper[4711]: I1203 12:44:58.303262 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m2tgt" Dec 03 12:44:58 crc kubenswrapper[4711]: I1203 12:44:58.600612 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m2tgt"] Dec 03 12:44:58 crc kubenswrapper[4711]: W1203 12:44:58.604430 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod800e0d85_0d20_4da5_83a4_e41af85ef6d3.slice/crio-2b66dcc207fc4e7417e98bce723bdd3ccb8a7ca8ce020bed18206167fce39c22 WatchSource:0}: Error finding container 2b66dcc207fc4e7417e98bce723bdd3ccb8a7ca8ce020bed18206167fce39c22: Status 404 returned error can't find the container with id 2b66dcc207fc4e7417e98bce723bdd3ccb8a7ca8ce020bed18206167fce39c22 Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.325163 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.329995 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.332445 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-storage-config-data" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.332706 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-files" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.332732 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-swift-dockercfg-5tvh8" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.332788 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-conf" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.350868 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.449980 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d8773367-0744-42d7-a9ee-8f508d9d9c97-cache\") pod \"swift-storage-0\" (UID: \"d8773367-0744-42d7-a9ee-8f508d9d9c97\") " pod="glance-kuttl-tests/swift-storage-0" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.450088 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d8773367-0744-42d7-a9ee-8f508d9d9c97-lock\") pod \"swift-storage-0\" (UID: \"d8773367-0744-42d7-a9ee-8f508d9d9c97\") " pod="glance-kuttl-tests/swift-storage-0" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.450115 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d8773367-0744-42d7-a9ee-8f508d9d9c97-etc-swift\") pod \"swift-storage-0\" (UID: 
\"d8773367-0744-42d7-a9ee-8f508d9d9c97\") " pod="glance-kuttl-tests/swift-storage-0" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.450185 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"swift-storage-0\" (UID: \"d8773367-0744-42d7-a9ee-8f508d9d9c97\") " pod="glance-kuttl-tests/swift-storage-0" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.450236 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p684f\" (UniqueName: \"kubernetes.io/projected/d8773367-0744-42d7-a9ee-8f508d9d9c97-kube-api-access-p684f\") pod \"swift-storage-0\" (UID: \"d8773367-0744-42d7-a9ee-8f508d9d9c97\") " pod="glance-kuttl-tests/swift-storage-0" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.518527 4711 generic.go:334] "Generic (PLEG): container finished" podID="800e0d85-0d20-4da5-83a4-e41af85ef6d3" containerID="1ad369ad1faf1d9165a3fd025d1d4492b453915bc2a396e582b4af9da2662382" exitCode=0 Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.518573 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m2tgt" event={"ID":"800e0d85-0d20-4da5-83a4-e41af85ef6d3","Type":"ContainerDied","Data":"1ad369ad1faf1d9165a3fd025d1d4492b453915bc2a396e582b4af9da2662382"} Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.518604 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m2tgt" event={"ID":"800e0d85-0d20-4da5-83a4-e41af85ef6d3","Type":"ContainerStarted","Data":"2b66dcc207fc4e7417e98bce723bdd3ccb8a7ca8ce020bed18206167fce39c22"} Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.552533 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d8773367-0744-42d7-a9ee-8f508d9d9c97-lock\") pod 
\"swift-storage-0\" (UID: \"d8773367-0744-42d7-a9ee-8f508d9d9c97\") " pod="glance-kuttl-tests/swift-storage-0" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.552604 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d8773367-0744-42d7-a9ee-8f508d9d9c97-etc-swift\") pod \"swift-storage-0\" (UID: \"d8773367-0744-42d7-a9ee-8f508d9d9c97\") " pod="glance-kuttl-tests/swift-storage-0" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.552747 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"swift-storage-0\" (UID: \"d8773367-0744-42d7-a9ee-8f508d9d9c97\") " pod="glance-kuttl-tests/swift-storage-0" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.552812 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p684f\" (UniqueName: \"kubernetes.io/projected/d8773367-0744-42d7-a9ee-8f508d9d9c97-kube-api-access-p684f\") pod \"swift-storage-0\" (UID: \"d8773367-0744-42d7-a9ee-8f508d9d9c97\") " pod="glance-kuttl-tests/swift-storage-0" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.552856 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d8773367-0744-42d7-a9ee-8f508d9d9c97-cache\") pod \"swift-storage-0\" (UID: \"d8773367-0744-42d7-a9ee-8f508d9d9c97\") " pod="glance-kuttl-tests/swift-storage-0" Dec 03 12:44:59 crc kubenswrapper[4711]: E1203 12:44:59.553227 4711 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Dec 03 12:44:59 crc kubenswrapper[4711]: E1203 12:44:59.553249 4711 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Dec 03 12:44:59 crc 
kubenswrapper[4711]: I1203 12:44:59.553265 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d8773367-0744-42d7-a9ee-8f508d9d9c97-lock\") pod \"swift-storage-0\" (UID: \"d8773367-0744-42d7-a9ee-8f508d9d9c97\") " pod="glance-kuttl-tests/swift-storage-0" Dec 03 12:44:59 crc kubenswrapper[4711]: E1203 12:44:59.553314 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d8773367-0744-42d7-a9ee-8f508d9d9c97-etc-swift podName:d8773367-0744-42d7-a9ee-8f508d9d9c97 nodeName:}" failed. No retries permitted until 2025-12-03 12:45:00.053295904 +0000 UTC m=+1818.722547159 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d8773367-0744-42d7-a9ee-8f508d9d9c97-etc-swift") pod "swift-storage-0" (UID: "d8773367-0744-42d7-a9ee-8f508d9d9c97") : configmap "swift-ring-files" not found Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.553344 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d8773367-0744-42d7-a9ee-8f508d9d9c97-cache\") pod \"swift-storage-0\" (UID: \"d8773367-0744-42d7-a9ee-8f508d9d9c97\") " pod="glance-kuttl-tests/swift-storage-0" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.553354 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"swift-storage-0\" (UID: \"d8773367-0744-42d7-a9ee-8f508d9d9c97\") device mount path \"/mnt/openstack/pv13\"" pod="glance-kuttl-tests/swift-storage-0" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.573337 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p684f\" (UniqueName: \"kubernetes.io/projected/d8773367-0744-42d7-a9ee-8f508d9d9c97-kube-api-access-p684f\") pod \"swift-storage-0\" (UID: 
\"d8773367-0744-42d7-a9ee-8f508d9d9c97\") " pod="glance-kuttl-tests/swift-storage-0" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.577247 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"swift-storage-0\" (UID: \"d8773367-0744-42d7-a9ee-8f508d9d9c97\") " pod="glance-kuttl-tests/swift-storage-0" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.679839 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-h7s4b"] Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.680729 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-h7s4b" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.683055 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-proxy-config-data" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.683104 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-scripts" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.690940 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-h7s4b"] Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.694180 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-config-data" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.730379 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-h7s4b"] Dec 03 12:44:59 crc kubenswrapper[4711]: E1203 12:44:59.731026 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[dispersionconf etc-swift kube-api-access-zdxnt ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[dispersionconf etc-swift kube-api-access-zdxnt ring-data-devices 
scripts swiftconf]: context canceled" pod="glance-kuttl-tests/swift-ring-rebalance-h7s4b" podUID="a1c9d318-e8d2-4d51-9ac3-79941fd37cdb" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.738514 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-tcsg8"] Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.739756 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-tcsg8" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.760849 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-tcsg8"] Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.857885 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-scripts\") pod \"swift-ring-rebalance-tcsg8\" (UID: \"4f07a4c6-d466-42b4-aa65-5f58fa554bbd\") " pod="glance-kuttl-tests/swift-ring-rebalance-tcsg8" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.857974 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5h47\" (UniqueName: \"kubernetes.io/projected/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-kube-api-access-k5h47\") pod \"swift-ring-rebalance-tcsg8\" (UID: \"4f07a4c6-d466-42b4-aa65-5f58fa554bbd\") " pod="glance-kuttl-tests/swift-ring-rebalance-tcsg8" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.858006 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-dispersionconf\") pod \"swift-ring-rebalance-h7s4b\" (UID: \"a1c9d318-e8d2-4d51-9ac3-79941fd37cdb\") " pod="glance-kuttl-tests/swift-ring-rebalance-h7s4b" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.858069 4711 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-swiftconf\") pod \"swift-ring-rebalance-h7s4b\" (UID: \"a1c9d318-e8d2-4d51-9ac3-79941fd37cdb\") " pod="glance-kuttl-tests/swift-ring-rebalance-h7s4b" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.858243 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdxnt\" (UniqueName: \"kubernetes.io/projected/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-kube-api-access-zdxnt\") pod \"swift-ring-rebalance-h7s4b\" (UID: \"a1c9d318-e8d2-4d51-9ac3-79941fd37cdb\") " pod="glance-kuttl-tests/swift-ring-rebalance-h7s4b" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.858291 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-dispersionconf\") pod \"swift-ring-rebalance-tcsg8\" (UID: \"4f07a4c6-d466-42b4-aa65-5f58fa554bbd\") " pod="glance-kuttl-tests/swift-ring-rebalance-tcsg8" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.858651 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-swiftconf\") pod \"swift-ring-rebalance-tcsg8\" (UID: \"4f07a4c6-d466-42b4-aa65-5f58fa554bbd\") " pod="glance-kuttl-tests/swift-ring-rebalance-tcsg8" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.858714 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-ring-data-devices\") pod \"swift-ring-rebalance-h7s4b\" (UID: \"a1c9d318-e8d2-4d51-9ac3-79941fd37cdb\") " pod="glance-kuttl-tests/swift-ring-rebalance-h7s4b" Dec 03 12:44:59 crc 
kubenswrapper[4711]: I1203 12:44:59.858760 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-ring-data-devices\") pod \"swift-ring-rebalance-tcsg8\" (UID: \"4f07a4c6-d466-42b4-aa65-5f58fa554bbd\") " pod="glance-kuttl-tests/swift-ring-rebalance-tcsg8" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.858790 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-scripts\") pod \"swift-ring-rebalance-h7s4b\" (UID: \"a1c9d318-e8d2-4d51-9ac3-79941fd37cdb\") " pod="glance-kuttl-tests/swift-ring-rebalance-h7s4b" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.858853 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-etc-swift\") pod \"swift-ring-rebalance-tcsg8\" (UID: \"4f07a4c6-d466-42b4-aa65-5f58fa554bbd\") " pod="glance-kuttl-tests/swift-ring-rebalance-tcsg8" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.858927 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-etc-swift\") pod \"swift-ring-rebalance-h7s4b\" (UID: \"a1c9d318-e8d2-4d51-9ac3-79941fd37cdb\") " pod="glance-kuttl-tests/swift-ring-rebalance-h7s4b" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.960733 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-dispersionconf\") pod \"swift-ring-rebalance-tcsg8\" (UID: \"4f07a4c6-d466-42b4-aa65-5f58fa554bbd\") " pod="glance-kuttl-tests/swift-ring-rebalance-tcsg8" Dec 
03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.960804 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdxnt\" (UniqueName: \"kubernetes.io/projected/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-kube-api-access-zdxnt\") pod \"swift-ring-rebalance-h7s4b\" (UID: \"a1c9d318-e8d2-4d51-9ac3-79941fd37cdb\") " pod="glance-kuttl-tests/swift-ring-rebalance-h7s4b" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.960945 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-swiftconf\") pod \"swift-ring-rebalance-tcsg8\" (UID: \"4f07a4c6-d466-42b4-aa65-5f58fa554bbd\") " pod="glance-kuttl-tests/swift-ring-rebalance-tcsg8" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.960975 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-ring-data-devices\") pod \"swift-ring-rebalance-h7s4b\" (UID: \"a1c9d318-e8d2-4d51-9ac3-79941fd37cdb\") " pod="glance-kuttl-tests/swift-ring-rebalance-h7s4b" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.961001 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-ring-data-devices\") pod \"swift-ring-rebalance-tcsg8\" (UID: \"4f07a4c6-d466-42b4-aa65-5f58fa554bbd\") " pod="glance-kuttl-tests/swift-ring-rebalance-tcsg8" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.961056 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-scripts\") pod \"swift-ring-rebalance-h7s4b\" (UID: \"a1c9d318-e8d2-4d51-9ac3-79941fd37cdb\") " pod="glance-kuttl-tests/swift-ring-rebalance-h7s4b" Dec 03 12:44:59 crc 
kubenswrapper[4711]: I1203 12:44:59.961079 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-etc-swift\") pod \"swift-ring-rebalance-tcsg8\" (UID: \"4f07a4c6-d466-42b4-aa65-5f58fa554bbd\") " pod="glance-kuttl-tests/swift-ring-rebalance-tcsg8" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.961106 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-etc-swift\") pod \"swift-ring-rebalance-h7s4b\" (UID: \"a1c9d318-e8d2-4d51-9ac3-79941fd37cdb\") " pod="glance-kuttl-tests/swift-ring-rebalance-h7s4b" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.961175 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-scripts\") pod \"swift-ring-rebalance-tcsg8\" (UID: \"4f07a4c6-d466-42b4-aa65-5f58fa554bbd\") " pod="glance-kuttl-tests/swift-ring-rebalance-tcsg8" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.961211 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5h47\" (UniqueName: \"kubernetes.io/projected/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-kube-api-access-k5h47\") pod \"swift-ring-rebalance-tcsg8\" (UID: \"4f07a4c6-d466-42b4-aa65-5f58fa554bbd\") " pod="glance-kuttl-tests/swift-ring-rebalance-tcsg8" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.961236 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-dispersionconf\") pod \"swift-ring-rebalance-h7s4b\" (UID: \"a1c9d318-e8d2-4d51-9ac3-79941fd37cdb\") " pod="glance-kuttl-tests/swift-ring-rebalance-h7s4b" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.961286 4711 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-swiftconf\") pod \"swift-ring-rebalance-h7s4b\" (UID: \"a1c9d318-e8d2-4d51-9ac3-79941fd37cdb\") " pod="glance-kuttl-tests/swift-ring-rebalance-h7s4b" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.961692 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-etc-swift\") pod \"swift-ring-rebalance-tcsg8\" (UID: \"4f07a4c6-d466-42b4-aa65-5f58fa554bbd\") " pod="glance-kuttl-tests/swift-ring-rebalance-tcsg8" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.961985 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-ring-data-devices\") pod \"swift-ring-rebalance-h7s4b\" (UID: \"a1c9d318-e8d2-4d51-9ac3-79941fd37cdb\") " pod="glance-kuttl-tests/swift-ring-rebalance-h7s4b" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.962110 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-scripts\") pod \"swift-ring-rebalance-h7s4b\" (UID: \"a1c9d318-e8d2-4d51-9ac3-79941fd37cdb\") " pod="glance-kuttl-tests/swift-ring-rebalance-h7s4b" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.962134 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-etc-swift\") pod \"swift-ring-rebalance-h7s4b\" (UID: \"a1c9d318-e8d2-4d51-9ac3-79941fd37cdb\") " pod="glance-kuttl-tests/swift-ring-rebalance-h7s4b" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.962361 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-ring-data-devices\") pod \"swift-ring-rebalance-tcsg8\" (UID: \"4f07a4c6-d466-42b4-aa65-5f58fa554bbd\") " pod="glance-kuttl-tests/swift-ring-rebalance-tcsg8" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.962964 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-scripts\") pod \"swift-ring-rebalance-tcsg8\" (UID: \"4f07a4c6-d466-42b4-aa65-5f58fa554bbd\") " pod="glance-kuttl-tests/swift-ring-rebalance-tcsg8" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.967090 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-swiftconf\") pod \"swift-ring-rebalance-tcsg8\" (UID: \"4f07a4c6-d466-42b4-aa65-5f58fa554bbd\") " pod="glance-kuttl-tests/swift-ring-rebalance-tcsg8" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.967466 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-swiftconf\") pod \"swift-ring-rebalance-h7s4b\" (UID: \"a1c9d318-e8d2-4d51-9ac3-79941fd37cdb\") " pod="glance-kuttl-tests/swift-ring-rebalance-h7s4b" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.970674 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-dispersionconf\") pod \"swift-ring-rebalance-h7s4b\" (UID: \"a1c9d318-e8d2-4d51-9ac3-79941fd37cdb\") " pod="glance-kuttl-tests/swift-ring-rebalance-h7s4b" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.970849 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-dispersionconf\") pod 
\"swift-ring-rebalance-tcsg8\" (UID: \"4f07a4c6-d466-42b4-aa65-5f58fa554bbd\") " pod="glance-kuttl-tests/swift-ring-rebalance-tcsg8" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.986275 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdxnt\" (UniqueName: \"kubernetes.io/projected/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-kube-api-access-zdxnt\") pod \"swift-ring-rebalance-h7s4b\" (UID: \"a1c9d318-e8d2-4d51-9ac3-79941fd37cdb\") " pod="glance-kuttl-tests/swift-ring-rebalance-h7s4b" Dec 03 12:44:59 crc kubenswrapper[4711]: I1203 12:44:59.990442 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5h47\" (UniqueName: \"kubernetes.io/projected/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-kube-api-access-k5h47\") pod \"swift-ring-rebalance-tcsg8\" (UID: \"4f07a4c6-d466-42b4-aa65-5f58fa554bbd\") " pod="glance-kuttl-tests/swift-ring-rebalance-tcsg8" Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.052623 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-tcsg8" Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.062198 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d8773367-0744-42d7-a9ee-8f508d9d9c97-etc-swift\") pod \"swift-storage-0\" (UID: \"d8773367-0744-42d7-a9ee-8f508d9d9c97\") " pod="glance-kuttl-tests/swift-storage-0" Dec 03 12:45:00 crc kubenswrapper[4711]: E1203 12:45:00.062352 4711 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Dec 03 12:45:00 crc kubenswrapper[4711]: E1203 12:45:00.062441 4711 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Dec 03 12:45:00 crc kubenswrapper[4711]: E1203 12:45:00.062495 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d8773367-0744-42d7-a9ee-8f508d9d9c97-etc-swift podName:d8773367-0744-42d7-a9ee-8f508d9d9c97 nodeName:}" failed. No retries permitted until 2025-12-03 12:45:01.062477109 +0000 UTC m=+1819.731728364 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d8773367-0744-42d7-a9ee-8f508d9d9c97-etc-swift") pod "swift-storage-0" (UID: "d8773367-0744-42d7-a9ee-8f508d9d9c97") : configmap "swift-ring-files" not found Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.141006 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412765-hb7rk"] Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.142128 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-hb7rk" Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.149086 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.149185 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.155762 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412765-hb7rk"] Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.264764 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82024fde-3916-4dd5-bedc-9c1e30fb5973-secret-volume\") pod \"collect-profiles-29412765-hb7rk\" (UID: \"82024fde-3916-4dd5-bedc-9c1e30fb5973\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-hb7rk" Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.264920 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82024fde-3916-4dd5-bedc-9c1e30fb5973-config-volume\") pod \"collect-profiles-29412765-hb7rk\" (UID: \"82024fde-3916-4dd5-bedc-9c1e30fb5973\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-hb7rk" Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.264969 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mrkc\" (UniqueName: \"kubernetes.io/projected/82024fde-3916-4dd5-bedc-9c1e30fb5973-kube-api-access-6mrkc\") pod \"collect-profiles-29412765-hb7rk\" (UID: \"82024fde-3916-4dd5-bedc-9c1e30fb5973\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-hb7rk" Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.365707 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mrkc\" (UniqueName: \"kubernetes.io/projected/82024fde-3916-4dd5-bedc-9c1e30fb5973-kube-api-access-6mrkc\") pod \"collect-profiles-29412765-hb7rk\" (UID: \"82024fde-3916-4dd5-bedc-9c1e30fb5973\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-hb7rk" Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.365788 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82024fde-3916-4dd5-bedc-9c1e30fb5973-secret-volume\") pod \"collect-profiles-29412765-hb7rk\" (UID: \"82024fde-3916-4dd5-bedc-9c1e30fb5973\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-hb7rk" Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.365858 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82024fde-3916-4dd5-bedc-9c1e30fb5973-config-volume\") pod \"collect-profiles-29412765-hb7rk\" (UID: \"82024fde-3916-4dd5-bedc-9c1e30fb5973\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-hb7rk" Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.366787 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82024fde-3916-4dd5-bedc-9c1e30fb5973-config-volume\") pod \"collect-profiles-29412765-hb7rk\" (UID: \"82024fde-3916-4dd5-bedc-9c1e30fb5973\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-hb7rk" Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.369773 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/82024fde-3916-4dd5-bedc-9c1e30fb5973-secret-volume\") pod \"collect-profiles-29412765-hb7rk\" (UID: \"82024fde-3916-4dd5-bedc-9c1e30fb5973\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-hb7rk" Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.387692 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mrkc\" (UniqueName: \"kubernetes.io/projected/82024fde-3916-4dd5-bedc-9c1e30fb5973-kube-api-access-6mrkc\") pod \"collect-profiles-29412765-hb7rk\" (UID: \"82024fde-3916-4dd5-bedc-9c1e30fb5973\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-hb7rk" Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.470120 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-hb7rk" Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.533182 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-h7s4b" Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.554060 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-h7s4b" Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.576248 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-tcsg8"] Dec 03 12:45:00 crc kubenswrapper[4711]: W1203 12:45:00.651900 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f07a4c6_d466_42b4_aa65_5f58fa554bbd.slice/crio-a0b4c2295b0f1a909f634d1458eed878ec0cf4bb8c9c7348db32e7e850bc354d WatchSource:0}: Error finding container a0b4c2295b0f1a909f634d1458eed878ec0cf4bb8c9c7348db32e7e850bc354d: Status 404 returned error can't find the container with id a0b4c2295b0f1a909f634d1458eed878ec0cf4bb8c9c7348db32e7e850bc354d Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.670383 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdxnt\" (UniqueName: \"kubernetes.io/projected/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-kube-api-access-zdxnt\") pod \"a1c9d318-e8d2-4d51-9ac3-79941fd37cdb\" (UID: \"a1c9d318-e8d2-4d51-9ac3-79941fd37cdb\") " Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.670522 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-ring-data-devices\") pod \"a1c9d318-e8d2-4d51-9ac3-79941fd37cdb\" (UID: \"a1c9d318-e8d2-4d51-9ac3-79941fd37cdb\") " Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.670584 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-etc-swift\") pod \"a1c9d318-e8d2-4d51-9ac3-79941fd37cdb\" (UID: \"a1c9d318-e8d2-4d51-9ac3-79941fd37cdb\") " Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.670609 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-dispersionconf\") pod \"a1c9d318-e8d2-4d51-9ac3-79941fd37cdb\" (UID: \"a1c9d318-e8d2-4d51-9ac3-79941fd37cdb\") " Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.670632 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-scripts\") pod \"a1c9d318-e8d2-4d51-9ac3-79941fd37cdb\" (UID: \"a1c9d318-e8d2-4d51-9ac3-79941fd37cdb\") " Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.670653 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-swiftconf\") pod \"a1c9d318-e8d2-4d51-9ac3-79941fd37cdb\" (UID: \"a1c9d318-e8d2-4d51-9ac3-79941fd37cdb\") " Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.671632 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a1c9d318-e8d2-4d51-9ac3-79941fd37cdb" (UID: "a1c9d318-e8d2-4d51-9ac3-79941fd37cdb"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.676876 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a1c9d318-e8d2-4d51-9ac3-79941fd37cdb" (UID: "a1c9d318-e8d2-4d51-9ac3-79941fd37cdb"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.677387 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a1c9d318-e8d2-4d51-9ac3-79941fd37cdb" (UID: "a1c9d318-e8d2-4d51-9ac3-79941fd37cdb"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.677622 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-kube-api-access-zdxnt" (OuterVolumeSpecName: "kube-api-access-zdxnt") pod "a1c9d318-e8d2-4d51-9ac3-79941fd37cdb" (UID: "a1c9d318-e8d2-4d51-9ac3-79941fd37cdb"). InnerVolumeSpecName "kube-api-access-zdxnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.679524 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a1c9d318-e8d2-4d51-9ac3-79941fd37cdb" (UID: "a1c9d318-e8d2-4d51-9ac3-79941fd37cdb"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.680338 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-scripts" (OuterVolumeSpecName: "scripts") pod "a1c9d318-e8d2-4d51-9ac3-79941fd37cdb" (UID: "a1c9d318-e8d2-4d51-9ac3-79941fd37cdb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.772154 4711 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.772189 4711 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.772200 4711 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.772215 4711 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.772227 4711 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.772238 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdxnt\" (UniqueName: \"kubernetes.io/projected/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb-kube-api-access-zdxnt\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:00 crc kubenswrapper[4711]: I1203 12:45:00.940033 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412765-hb7rk"] Dec 03 12:45:00 crc kubenswrapper[4711]: W1203 12:45:00.946121 4711 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82024fde_3916_4dd5_bedc_9c1e30fb5973.slice/crio-27a7401afd6e1e819a7cdf3a0a7ec980d0c1b0b6509027b99a1927e60cc1a6e8 WatchSource:0}: Error finding container 27a7401afd6e1e819a7cdf3a0a7ec980d0c1b0b6509027b99a1927e60cc1a6e8: Status 404 returned error can't find the container with id 27a7401afd6e1e819a7cdf3a0a7ec980d0c1b0b6509027b99a1927e60cc1a6e8 Dec 03 12:45:01 crc kubenswrapper[4711]: I1203 12:45:01.078044 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d8773367-0744-42d7-a9ee-8f508d9d9c97-etc-swift\") pod \"swift-storage-0\" (UID: \"d8773367-0744-42d7-a9ee-8f508d9d9c97\") " pod="glance-kuttl-tests/swift-storage-0" Dec 03 12:45:01 crc kubenswrapper[4711]: E1203 12:45:01.078249 4711 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Dec 03 12:45:01 crc kubenswrapper[4711]: E1203 12:45:01.078379 4711 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Dec 03 12:45:01 crc kubenswrapper[4711]: E1203 12:45:01.078441 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d8773367-0744-42d7-a9ee-8f508d9d9c97-etc-swift podName:d8773367-0744-42d7-a9ee-8f508d9d9c97 nodeName:}" failed. No retries permitted until 2025-12-03 12:45:03.078423033 +0000 UTC m=+1821.747674288 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d8773367-0744-42d7-a9ee-8f508d9d9c97-etc-swift") pod "swift-storage-0" (UID: "d8773367-0744-42d7-a9ee-8f508d9d9c97") : configmap "swift-ring-files" not found Dec 03 12:45:01 crc kubenswrapper[4711]: I1203 12:45:01.545583 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m2tgt" event={"ID":"800e0d85-0d20-4da5-83a4-e41af85ef6d3","Type":"ContainerStarted","Data":"0befea12b3f29624b5fdb7e9ec2ea4f1ec9c21f2dc1eab8d98cc94df58e5b0de"} Dec 03 12:45:01 crc kubenswrapper[4711]: I1203 12:45:01.554604 4711 generic.go:334] "Generic (PLEG): container finished" podID="82024fde-3916-4dd5-bedc-9c1e30fb5973" containerID="057cd18240b3aec1428bf92e29f01e18aad57b7a14b39141c37e7c88403546c2" exitCode=0 Dec 03 12:45:01 crc kubenswrapper[4711]: I1203 12:45:01.554714 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-hb7rk" event={"ID":"82024fde-3916-4dd5-bedc-9c1e30fb5973","Type":"ContainerDied","Data":"057cd18240b3aec1428bf92e29f01e18aad57b7a14b39141c37e7c88403546c2"} Dec 03 12:45:01 crc kubenswrapper[4711]: I1203 12:45:01.554750 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-hb7rk" event={"ID":"82024fde-3916-4dd5-bedc-9c1e30fb5973","Type":"ContainerStarted","Data":"27a7401afd6e1e819a7cdf3a0a7ec980d0c1b0b6509027b99a1927e60cc1a6e8"} Dec 03 12:45:01 crc kubenswrapper[4711]: I1203 12:45:01.558147 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-h7s4b" Dec 03 12:45:01 crc kubenswrapper[4711]: I1203 12:45:01.559612 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-tcsg8" event={"ID":"4f07a4c6-d466-42b4-aa65-5f58fa554bbd","Type":"ContainerStarted","Data":"a0b4c2295b0f1a909f634d1458eed878ec0cf4bb8c9c7348db32e7e850bc354d"} Dec 03 12:45:01 crc kubenswrapper[4711]: I1203 12:45:01.637766 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-h7s4b"] Dec 03 12:45:01 crc kubenswrapper[4711]: I1203 12:45:01.644270 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-h7s4b"] Dec 03 12:45:01 crc kubenswrapper[4711]: I1203 12:45:01.841967 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1c9d318-e8d2-4d51-9ac3-79941fd37cdb" path="/var/lib/kubelet/pods/a1c9d318-e8d2-4d51-9ac3-79941fd37cdb/volumes" Dec 03 12:45:02 crc kubenswrapper[4711]: I1203 12:45:02.504947 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw"] Dec 03 12:45:02 crc kubenswrapper[4711]: I1203 12:45:02.506313 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw" Dec 03 12:45:02 crc kubenswrapper[4711]: I1203 12:45:02.525399 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw"] Dec 03 12:45:02 crc kubenswrapper[4711]: I1203 12:45:02.571296 4711 generic.go:334] "Generic (PLEG): container finished" podID="800e0d85-0d20-4da5-83a4-e41af85ef6d3" containerID="0befea12b3f29624b5fdb7e9ec2ea4f1ec9c21f2dc1eab8d98cc94df58e5b0de" exitCode=0 Dec 03 12:45:02 crc kubenswrapper[4711]: I1203 12:45:02.571393 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m2tgt" event={"ID":"800e0d85-0d20-4da5-83a4-e41af85ef6d3","Type":"ContainerDied","Data":"0befea12b3f29624b5fdb7e9ec2ea4f1ec9c21f2dc1eab8d98cc94df58e5b0de"} Dec 03 12:45:02 crc kubenswrapper[4711]: I1203 12:45:02.602768 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a481af3-5cbd-47a8-a9b5-d8d73feab045-log-httpd\") pod \"swift-proxy-8cfd9857-v9jnw\" (UID: \"8a481af3-5cbd-47a8-a9b5-d8d73feab045\") " pod="glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw" Dec 03 12:45:02 crc kubenswrapper[4711]: I1203 12:45:02.602844 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8a481af3-5cbd-47a8-a9b5-d8d73feab045-etc-swift\") pod \"swift-proxy-8cfd9857-v9jnw\" (UID: \"8a481af3-5cbd-47a8-a9b5-d8d73feab045\") " pod="glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw" Dec 03 12:45:02 crc kubenswrapper[4711]: I1203 12:45:02.602880 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a481af3-5cbd-47a8-a9b5-d8d73feab045-config-data\") pod \"swift-proxy-8cfd9857-v9jnw\" (UID: \"8a481af3-5cbd-47a8-a9b5-d8d73feab045\") " 
pod="glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw" Dec 03 12:45:02 crc kubenswrapper[4711]: I1203 12:45:02.602999 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np42j\" (UniqueName: \"kubernetes.io/projected/8a481af3-5cbd-47a8-a9b5-d8d73feab045-kube-api-access-np42j\") pod \"swift-proxy-8cfd9857-v9jnw\" (UID: \"8a481af3-5cbd-47a8-a9b5-d8d73feab045\") " pod="glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw" Dec 03 12:45:02 crc kubenswrapper[4711]: I1203 12:45:02.603038 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a481af3-5cbd-47a8-a9b5-d8d73feab045-run-httpd\") pod \"swift-proxy-8cfd9857-v9jnw\" (UID: \"8a481af3-5cbd-47a8-a9b5-d8d73feab045\") " pod="glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw" Dec 03 12:45:02 crc kubenswrapper[4711]: I1203 12:45:02.705261 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8a481af3-5cbd-47a8-a9b5-d8d73feab045-etc-swift\") pod \"swift-proxy-8cfd9857-v9jnw\" (UID: \"8a481af3-5cbd-47a8-a9b5-d8d73feab045\") " pod="glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw" Dec 03 12:45:02 crc kubenswrapper[4711]: I1203 12:45:02.705331 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a481af3-5cbd-47a8-a9b5-d8d73feab045-config-data\") pod \"swift-proxy-8cfd9857-v9jnw\" (UID: \"8a481af3-5cbd-47a8-a9b5-d8d73feab045\") " pod="glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw" Dec 03 12:45:02 crc kubenswrapper[4711]: I1203 12:45:02.705375 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np42j\" (UniqueName: \"kubernetes.io/projected/8a481af3-5cbd-47a8-a9b5-d8d73feab045-kube-api-access-np42j\") pod \"swift-proxy-8cfd9857-v9jnw\" (UID: 
\"8a481af3-5cbd-47a8-a9b5-d8d73feab045\") " pod="glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw" Dec 03 12:45:02 crc kubenswrapper[4711]: I1203 12:45:02.705423 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a481af3-5cbd-47a8-a9b5-d8d73feab045-run-httpd\") pod \"swift-proxy-8cfd9857-v9jnw\" (UID: \"8a481af3-5cbd-47a8-a9b5-d8d73feab045\") " pod="glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw" Dec 03 12:45:02 crc kubenswrapper[4711]: I1203 12:45:02.705509 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a481af3-5cbd-47a8-a9b5-d8d73feab045-log-httpd\") pod \"swift-proxy-8cfd9857-v9jnw\" (UID: \"8a481af3-5cbd-47a8-a9b5-d8d73feab045\") " pod="glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw" Dec 03 12:45:02 crc kubenswrapper[4711]: E1203 12:45:02.705418 4711 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Dec 03 12:45:02 crc kubenswrapper[4711]: E1203 12:45:02.705774 4711 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw: configmap "swift-ring-files" not found Dec 03 12:45:02 crc kubenswrapper[4711]: E1203 12:45:02.705840 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8a481af3-5cbd-47a8-a9b5-d8d73feab045-etc-swift podName:8a481af3-5cbd-47a8-a9b5-d8d73feab045 nodeName:}" failed. No retries permitted until 2025-12-03 12:45:03.205819746 +0000 UTC m=+1821.875071001 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8a481af3-5cbd-47a8-a9b5-d8d73feab045-etc-swift") pod "swift-proxy-8cfd9857-v9jnw" (UID: "8a481af3-5cbd-47a8-a9b5-d8d73feab045") : configmap "swift-ring-files" not found Dec 03 12:45:02 crc kubenswrapper[4711]: I1203 12:45:02.706424 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a481af3-5cbd-47a8-a9b5-d8d73feab045-log-httpd\") pod \"swift-proxy-8cfd9857-v9jnw\" (UID: \"8a481af3-5cbd-47a8-a9b5-d8d73feab045\") " pod="glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw" Dec 03 12:45:02 crc kubenswrapper[4711]: I1203 12:45:02.706463 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a481af3-5cbd-47a8-a9b5-d8d73feab045-run-httpd\") pod \"swift-proxy-8cfd9857-v9jnw\" (UID: \"8a481af3-5cbd-47a8-a9b5-d8d73feab045\") " pod="glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw" Dec 03 12:45:02 crc kubenswrapper[4711]: I1203 12:45:02.712651 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a481af3-5cbd-47a8-a9b5-d8d73feab045-config-data\") pod \"swift-proxy-8cfd9857-v9jnw\" (UID: \"8a481af3-5cbd-47a8-a9b5-d8d73feab045\") " pod="glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw" Dec 03 12:45:02 crc kubenswrapper[4711]: I1203 12:45:02.726062 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np42j\" (UniqueName: \"kubernetes.io/projected/8a481af3-5cbd-47a8-a9b5-d8d73feab045-kube-api-access-np42j\") pod \"swift-proxy-8cfd9857-v9jnw\" (UID: \"8a481af3-5cbd-47a8-a9b5-d8d73feab045\") " pod="glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw" Dec 03 12:45:03 crc kubenswrapper[4711]: I1203 12:45:03.111686 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/d8773367-0744-42d7-a9ee-8f508d9d9c97-etc-swift\") pod \"swift-storage-0\" (UID: \"d8773367-0744-42d7-a9ee-8f508d9d9c97\") " pod="glance-kuttl-tests/swift-storage-0" Dec 03 12:45:03 crc kubenswrapper[4711]: E1203 12:45:03.111870 4711 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Dec 03 12:45:03 crc kubenswrapper[4711]: E1203 12:45:03.111887 4711 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Dec 03 12:45:03 crc kubenswrapper[4711]: E1203 12:45:03.111951 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d8773367-0744-42d7-a9ee-8f508d9d9c97-etc-swift podName:d8773367-0744-42d7-a9ee-8f508d9d9c97 nodeName:}" failed. No retries permitted until 2025-12-03 12:45:07.111932414 +0000 UTC m=+1825.781183669 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d8773367-0744-42d7-a9ee-8f508d9d9c97-etc-swift") pod "swift-storage-0" (UID: "d8773367-0744-42d7-a9ee-8f508d9d9c97") : configmap "swift-ring-files" not found Dec 03 12:45:03 crc kubenswrapper[4711]: I1203 12:45:03.213019 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8a481af3-5cbd-47a8-a9b5-d8d73feab045-etc-swift\") pod \"swift-proxy-8cfd9857-v9jnw\" (UID: \"8a481af3-5cbd-47a8-a9b5-d8d73feab045\") " pod="glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw" Dec 03 12:45:03 crc kubenswrapper[4711]: E1203 12:45:03.213161 4711 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Dec 03 12:45:03 crc kubenswrapper[4711]: E1203 12:45:03.213254 4711 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw: configmap 
"swift-ring-files" not found Dec 03 12:45:03 crc kubenswrapper[4711]: E1203 12:45:03.213304 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8a481af3-5cbd-47a8-a9b5-d8d73feab045-etc-swift podName:8a481af3-5cbd-47a8-a9b5-d8d73feab045 nodeName:}" failed. No retries permitted until 2025-12-03 12:45:04.213287954 +0000 UTC m=+1822.882539209 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8a481af3-5cbd-47a8-a9b5-d8d73feab045-etc-swift") pod "swift-proxy-8cfd9857-v9jnw" (UID: "8a481af3-5cbd-47a8-a9b5-d8d73feab045") : configmap "swift-ring-files" not found Dec 03 12:45:04 crc kubenswrapper[4711]: I1203 12:45:04.227327 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8a481af3-5cbd-47a8-a9b5-d8d73feab045-etc-swift\") pod \"swift-proxy-8cfd9857-v9jnw\" (UID: \"8a481af3-5cbd-47a8-a9b5-d8d73feab045\") " pod="glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw" Dec 03 12:45:04 crc kubenswrapper[4711]: E1203 12:45:04.227590 4711 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Dec 03 12:45:04 crc kubenswrapper[4711]: E1203 12:45:04.227617 4711 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw: configmap "swift-ring-files" not found Dec 03 12:45:04 crc kubenswrapper[4711]: E1203 12:45:04.227693 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8a481af3-5cbd-47a8-a9b5-d8d73feab045-etc-swift podName:8a481af3-5cbd-47a8-a9b5-d8d73feab045 nodeName:}" failed. No retries permitted until 2025-12-03 12:45:06.227666494 +0000 UTC m=+1824.896917749 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8a481af3-5cbd-47a8-a9b5-d8d73feab045-etc-swift") pod "swift-proxy-8cfd9857-v9jnw" (UID: "8a481af3-5cbd-47a8-a9b5-d8d73feab045") : configmap "swift-ring-files" not found Dec 03 12:45:04 crc kubenswrapper[4711]: I1203 12:45:04.439557 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-hb7rk" Dec 03 12:45:04 crc kubenswrapper[4711]: I1203 12:45:04.532962 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mrkc\" (UniqueName: \"kubernetes.io/projected/82024fde-3916-4dd5-bedc-9c1e30fb5973-kube-api-access-6mrkc\") pod \"82024fde-3916-4dd5-bedc-9c1e30fb5973\" (UID: \"82024fde-3916-4dd5-bedc-9c1e30fb5973\") " Dec 03 12:45:04 crc kubenswrapper[4711]: I1203 12:45:04.533100 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82024fde-3916-4dd5-bedc-9c1e30fb5973-config-volume\") pod \"82024fde-3916-4dd5-bedc-9c1e30fb5973\" (UID: \"82024fde-3916-4dd5-bedc-9c1e30fb5973\") " Dec 03 12:45:04 crc kubenswrapper[4711]: I1203 12:45:04.533156 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82024fde-3916-4dd5-bedc-9c1e30fb5973-secret-volume\") pod \"82024fde-3916-4dd5-bedc-9c1e30fb5973\" (UID: \"82024fde-3916-4dd5-bedc-9c1e30fb5973\") " Dec 03 12:45:04 crc kubenswrapper[4711]: I1203 12:45:04.533893 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82024fde-3916-4dd5-bedc-9c1e30fb5973-config-volume" (OuterVolumeSpecName: "config-volume") pod "82024fde-3916-4dd5-bedc-9c1e30fb5973" (UID: "82024fde-3916-4dd5-bedc-9c1e30fb5973"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:45:04 crc kubenswrapper[4711]: I1203 12:45:04.536887 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82024fde-3916-4dd5-bedc-9c1e30fb5973-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "82024fde-3916-4dd5-bedc-9c1e30fb5973" (UID: "82024fde-3916-4dd5-bedc-9c1e30fb5973"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:45:04 crc kubenswrapper[4711]: I1203 12:45:04.537403 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82024fde-3916-4dd5-bedc-9c1e30fb5973-kube-api-access-6mrkc" (OuterVolumeSpecName: "kube-api-access-6mrkc") pod "82024fde-3916-4dd5-bedc-9c1e30fb5973" (UID: "82024fde-3916-4dd5-bedc-9c1e30fb5973"). InnerVolumeSpecName "kube-api-access-6mrkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:45:04 crc kubenswrapper[4711]: I1203 12:45:04.588722 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-hb7rk" event={"ID":"82024fde-3916-4dd5-bedc-9c1e30fb5973","Type":"ContainerDied","Data":"27a7401afd6e1e819a7cdf3a0a7ec980d0c1b0b6509027b99a1927e60cc1a6e8"} Dec 03 12:45:04 crc kubenswrapper[4711]: I1203 12:45:04.588769 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27a7401afd6e1e819a7cdf3a0a7ec980d0c1b0b6509027b99a1927e60cc1a6e8" Dec 03 12:45:04 crc kubenswrapper[4711]: I1203 12:45:04.588804 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-hb7rk" Dec 03 12:45:04 crc kubenswrapper[4711]: I1203 12:45:04.634586 4711 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82024fde-3916-4dd5-bedc-9c1e30fb5973-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:04 crc kubenswrapper[4711]: I1203 12:45:04.634873 4711 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82024fde-3916-4dd5-bedc-9c1e30fb5973-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:04 crc kubenswrapper[4711]: I1203 12:45:04.634996 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mrkc\" (UniqueName: \"kubernetes.io/projected/82024fde-3916-4dd5-bedc-9c1e30fb5973-kube-api-access-6mrkc\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:04 crc kubenswrapper[4711]: I1203 12:45:04.817344 4711 scope.go:117] "RemoveContainer" containerID="8aceb8eade31346b968b79b9eafdd0ba0740fbe4943a8f7cf724864aa6dd593f" Dec 03 12:45:04 crc kubenswrapper[4711]: E1203 12:45:04.817792 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:45:05 crc kubenswrapper[4711]: I1203 12:45:05.598021 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-tcsg8" event={"ID":"4f07a4c6-d466-42b4-aa65-5f58fa554bbd","Type":"ContainerStarted","Data":"3317c5b43d06b3964ee9ebbe456e8c42e3338df817044715619ba464de4c4d44"} Dec 03 12:45:05 crc kubenswrapper[4711]: I1203 12:45:05.601750 4711 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m2tgt" event={"ID":"800e0d85-0d20-4da5-83a4-e41af85ef6d3","Type":"ContainerStarted","Data":"7a74767d388777a6decfa8a38990f847da29f2e2b32f26d593139432f63a3253"} Dec 03 12:45:05 crc kubenswrapper[4711]: I1203 12:45:05.633610 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-ring-rebalance-tcsg8" podStartSLOduration=2.850634037 podStartE2EDuration="6.633584061s" podCreationTimestamp="2025-12-03 12:44:59 +0000 UTC" firstStartedPulling="2025-12-03 12:45:00.670034122 +0000 UTC m=+1819.339285377" lastFinishedPulling="2025-12-03 12:45:04.452984146 +0000 UTC m=+1823.122235401" observedRunningTime="2025-12-03 12:45:05.627338109 +0000 UTC m=+1824.296589384" watchObservedRunningTime="2025-12-03 12:45:05.633584061 +0000 UTC m=+1824.302835316" Dec 03 12:45:05 crc kubenswrapper[4711]: I1203 12:45:05.654631 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m2tgt" podStartSLOduration=3.72157374 podStartE2EDuration="8.654602889s" podCreationTimestamp="2025-12-03 12:44:57 +0000 UTC" firstStartedPulling="2025-12-03 12:44:59.520225644 +0000 UTC m=+1818.189476899" lastFinishedPulling="2025-12-03 12:45:04.453254793 +0000 UTC m=+1823.122506048" observedRunningTime="2025-12-03 12:45:05.645751285 +0000 UTC m=+1824.315002550" watchObservedRunningTime="2025-12-03 12:45:05.654602889 +0000 UTC m=+1824.323854164" Dec 03 12:45:06 crc kubenswrapper[4711]: I1203 12:45:06.258743 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8a481af3-5cbd-47a8-a9b5-d8d73feab045-etc-swift\") pod \"swift-proxy-8cfd9857-v9jnw\" (UID: \"8a481af3-5cbd-47a8-a9b5-d8d73feab045\") " pod="glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw" Dec 03 12:45:06 crc kubenswrapper[4711]: E1203 12:45:06.259026 4711 projected.go:288] Couldn't get configMap 
glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Dec 03 12:45:06 crc kubenswrapper[4711]: E1203 12:45:06.259050 4711 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw: configmap "swift-ring-files" not found Dec 03 12:45:06 crc kubenswrapper[4711]: E1203 12:45:06.259102 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8a481af3-5cbd-47a8-a9b5-d8d73feab045-etc-swift podName:8a481af3-5cbd-47a8-a9b5-d8d73feab045 nodeName:}" failed. No retries permitted until 2025-12-03 12:45:10.259086777 +0000 UTC m=+1828.928338032 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8a481af3-5cbd-47a8-a9b5-d8d73feab045-etc-swift") pod "swift-proxy-8cfd9857-v9jnw" (UID: "8a481af3-5cbd-47a8-a9b5-d8d73feab045") : configmap "swift-ring-files" not found Dec 03 12:45:07 crc kubenswrapper[4711]: I1203 12:45:07.172722 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d8773367-0744-42d7-a9ee-8f508d9d9c97-etc-swift\") pod \"swift-storage-0\" (UID: \"d8773367-0744-42d7-a9ee-8f508d9d9c97\") " pod="glance-kuttl-tests/swift-storage-0" Dec 03 12:45:07 crc kubenswrapper[4711]: E1203 12:45:07.173014 4711 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Dec 03 12:45:07 crc kubenswrapper[4711]: E1203 12:45:07.173031 4711 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Dec 03 12:45:07 crc kubenswrapper[4711]: E1203 12:45:07.173079 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d8773367-0744-42d7-a9ee-8f508d9d9c97-etc-swift podName:d8773367-0744-42d7-a9ee-8f508d9d9c97 nodeName:}" failed. 
No retries permitted until 2025-12-03 12:45:15.173062124 +0000 UTC m=+1833.842313369 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d8773367-0744-42d7-a9ee-8f508d9d9c97-etc-swift") pod "swift-storage-0" (UID: "d8773367-0744-42d7-a9ee-8f508d9d9c97") : configmap "swift-ring-files" not found Dec 03 12:45:08 crc kubenswrapper[4711]: I1203 12:45:08.304520 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m2tgt" Dec 03 12:45:08 crc kubenswrapper[4711]: I1203 12:45:08.304833 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m2tgt" Dec 03 12:45:09 crc kubenswrapper[4711]: I1203 12:45:09.363856 4711 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m2tgt" podUID="800e0d85-0d20-4da5-83a4-e41af85ef6d3" containerName="registry-server" probeResult="failure" output=< Dec 03 12:45:09 crc kubenswrapper[4711]: timeout: failed to connect service ":50051" within 1s Dec 03 12:45:09 crc kubenswrapper[4711]: > Dec 03 12:45:10 crc kubenswrapper[4711]: I1203 12:45:10.322390 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8a481af3-5cbd-47a8-a9b5-d8d73feab045-etc-swift\") pod \"swift-proxy-8cfd9857-v9jnw\" (UID: \"8a481af3-5cbd-47a8-a9b5-d8d73feab045\") " pod="glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw" Dec 03 12:45:10 crc kubenswrapper[4711]: E1203 12:45:10.322588 4711 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Dec 03 12:45:10 crc kubenswrapper[4711]: E1203 12:45:10.322808 4711 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw: configmap "swift-ring-files" not found Dec 03 12:45:10 crc kubenswrapper[4711]: E1203 
12:45:10.322876 4711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8a481af3-5cbd-47a8-a9b5-d8d73feab045-etc-swift podName:8a481af3-5cbd-47a8-a9b5-d8d73feab045 nodeName:}" failed. No retries permitted until 2025-12-03 12:45:18.322857641 +0000 UTC m=+1836.992108896 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8a481af3-5cbd-47a8-a9b5-d8d73feab045-etc-swift") pod "swift-proxy-8cfd9857-v9jnw" (UID: "8a481af3-5cbd-47a8-a9b5-d8d73feab045") : configmap "swift-ring-files" not found Dec 03 12:45:11 crc kubenswrapper[4711]: I1203 12:45:11.777585 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-index-82qq5"] Dec 03 12:45:11 crc kubenswrapper[4711]: E1203 12:45:11.778056 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82024fde-3916-4dd5-bedc-9c1e30fb5973" containerName="collect-profiles" Dec 03 12:45:11 crc kubenswrapper[4711]: I1203 12:45:11.778079 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="82024fde-3916-4dd5-bedc-9c1e30fb5973" containerName="collect-profiles" Dec 03 12:45:11 crc kubenswrapper[4711]: I1203 12:45:11.778301 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="82024fde-3916-4dd5-bedc-9c1e30fb5973" containerName="collect-profiles" Dec 03 12:45:11 crc kubenswrapper[4711]: I1203 12:45:11.779078 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-82qq5" Dec 03 12:45:11 crc kubenswrapper[4711]: I1203 12:45:11.782048 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-index-dockercfg-bgpfn" Dec 03 12:45:11 crc kubenswrapper[4711]: I1203 12:45:11.794808 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-82qq5"] Dec 03 12:45:11 crc kubenswrapper[4711]: I1203 12:45:11.943716 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p276w\" (UniqueName: \"kubernetes.io/projected/b55ffe9c-6afa-4506-b385-46bc81594479-kube-api-access-p276w\") pod \"glance-operator-index-82qq5\" (UID: \"b55ffe9c-6afa-4506-b385-46bc81594479\") " pod="openstack-operators/glance-operator-index-82qq5" Dec 03 12:45:12 crc kubenswrapper[4711]: I1203 12:45:12.044545 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p276w\" (UniqueName: \"kubernetes.io/projected/b55ffe9c-6afa-4506-b385-46bc81594479-kube-api-access-p276w\") pod \"glance-operator-index-82qq5\" (UID: \"b55ffe9c-6afa-4506-b385-46bc81594479\") " pod="openstack-operators/glance-operator-index-82qq5" Dec 03 12:45:12 crc kubenswrapper[4711]: I1203 12:45:12.064066 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p276w\" (UniqueName: \"kubernetes.io/projected/b55ffe9c-6afa-4506-b385-46bc81594479-kube-api-access-p276w\") pod \"glance-operator-index-82qq5\" (UID: \"b55ffe9c-6afa-4506-b385-46bc81594479\") " pod="openstack-operators/glance-operator-index-82qq5" Dec 03 12:45:12 crc kubenswrapper[4711]: I1203 12:45:12.109542 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-82qq5" Dec 03 12:45:12 crc kubenswrapper[4711]: I1203 12:45:12.555503 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-82qq5"] Dec 03 12:45:12 crc kubenswrapper[4711]: I1203 12:45:12.672099 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-82qq5" event={"ID":"b55ffe9c-6afa-4506-b385-46bc81594479","Type":"ContainerStarted","Data":"c8378a6a51180c3db7d3edc63558dc549b8bdabf419a16bd9a0717c2bed4e030"} Dec 03 12:45:14 crc kubenswrapper[4711]: I1203 12:45:14.693322 4711 generic.go:334] "Generic (PLEG): container finished" podID="4f07a4c6-d466-42b4-aa65-5f58fa554bbd" containerID="3317c5b43d06b3964ee9ebbe456e8c42e3338df817044715619ba464de4c4d44" exitCode=0 Dec 03 12:45:14 crc kubenswrapper[4711]: I1203 12:45:14.693405 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-tcsg8" event={"ID":"4f07a4c6-d466-42b4-aa65-5f58fa554bbd","Type":"ContainerDied","Data":"3317c5b43d06b3964ee9ebbe456e8c42e3338df817044715619ba464de4c4d44"} Dec 03 12:45:15 crc kubenswrapper[4711]: I1203 12:45:15.204512 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d8773367-0744-42d7-a9ee-8f508d9d9c97-etc-swift\") pod \"swift-storage-0\" (UID: \"d8773367-0744-42d7-a9ee-8f508d9d9c97\") " pod="glance-kuttl-tests/swift-storage-0" Dec 03 12:45:15 crc kubenswrapper[4711]: I1203 12:45:15.216245 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d8773367-0744-42d7-a9ee-8f508d9d9c97-etc-swift\") pod \"swift-storage-0\" (UID: \"d8773367-0744-42d7-a9ee-8f508d9d9c97\") " pod="glance-kuttl-tests/swift-storage-0" Dec 03 12:45:15 crc kubenswrapper[4711]: I1203 12:45:15.248983 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Dec 03 12:45:16 crc kubenswrapper[4711]: I1203 12:45:16.179537 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-tcsg8" Dec 03 12:45:16 crc kubenswrapper[4711]: I1203 12:45:16.319402 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5h47\" (UniqueName: \"kubernetes.io/projected/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-kube-api-access-k5h47\") pod \"4f07a4c6-d466-42b4-aa65-5f58fa554bbd\" (UID: \"4f07a4c6-d466-42b4-aa65-5f58fa554bbd\") " Dec 03 12:45:16 crc kubenswrapper[4711]: I1203 12:45:16.319478 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-dispersionconf\") pod \"4f07a4c6-d466-42b4-aa65-5f58fa554bbd\" (UID: \"4f07a4c6-d466-42b4-aa65-5f58fa554bbd\") " Dec 03 12:45:16 crc kubenswrapper[4711]: I1203 12:45:16.319524 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-ring-data-devices\") pod \"4f07a4c6-d466-42b4-aa65-5f58fa554bbd\" (UID: \"4f07a4c6-d466-42b4-aa65-5f58fa554bbd\") " Dec 03 12:45:16 crc kubenswrapper[4711]: I1203 12:45:16.319563 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-swiftconf\") pod \"4f07a4c6-d466-42b4-aa65-5f58fa554bbd\" (UID: \"4f07a4c6-d466-42b4-aa65-5f58fa554bbd\") " Dec 03 12:45:16 crc kubenswrapper[4711]: I1203 12:45:16.319643 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-scripts\") pod \"4f07a4c6-d466-42b4-aa65-5f58fa554bbd\" (UID: 
\"4f07a4c6-d466-42b4-aa65-5f58fa554bbd\") " Dec 03 12:45:16 crc kubenswrapper[4711]: I1203 12:45:16.319668 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-etc-swift\") pod \"4f07a4c6-d466-42b4-aa65-5f58fa554bbd\" (UID: \"4f07a4c6-d466-42b4-aa65-5f58fa554bbd\") " Dec 03 12:45:16 crc kubenswrapper[4711]: I1203 12:45:16.320426 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "4f07a4c6-d466-42b4-aa65-5f58fa554bbd" (UID: "4f07a4c6-d466-42b4-aa65-5f58fa554bbd"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:45:16 crc kubenswrapper[4711]: I1203 12:45:16.320972 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4f07a4c6-d466-42b4-aa65-5f58fa554bbd" (UID: "4f07a4c6-d466-42b4-aa65-5f58fa554bbd"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:45:16 crc kubenswrapper[4711]: I1203 12:45:16.325795 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-kube-api-access-k5h47" (OuterVolumeSpecName: "kube-api-access-k5h47") pod "4f07a4c6-d466-42b4-aa65-5f58fa554bbd" (UID: "4f07a4c6-d466-42b4-aa65-5f58fa554bbd"). InnerVolumeSpecName "kube-api-access-k5h47". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:45:16 crc kubenswrapper[4711]: I1203 12:45:16.332723 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "4f07a4c6-d466-42b4-aa65-5f58fa554bbd" (UID: "4f07a4c6-d466-42b4-aa65-5f58fa554bbd"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:45:16 crc kubenswrapper[4711]: I1203 12:45:16.342671 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-scripts" (OuterVolumeSpecName: "scripts") pod "4f07a4c6-d466-42b4-aa65-5f58fa554bbd" (UID: "4f07a4c6-d466-42b4-aa65-5f58fa554bbd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:45:16 crc kubenswrapper[4711]: I1203 12:45:16.343382 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "4f07a4c6-d466-42b4-aa65-5f58fa554bbd" (UID: "4f07a4c6-d466-42b4-aa65-5f58fa554bbd"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:45:16 crc kubenswrapper[4711]: I1203 12:45:16.421531 4711 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:16 crc kubenswrapper[4711]: I1203 12:45:16.421561 4711 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:16 crc kubenswrapper[4711]: I1203 12:45:16.421572 4711 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:16 crc kubenswrapper[4711]: I1203 12:45:16.421582 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5h47\" (UniqueName: \"kubernetes.io/projected/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-kube-api-access-k5h47\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:16 crc kubenswrapper[4711]: I1203 12:45:16.421591 4711 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:16 crc kubenswrapper[4711]: I1203 12:45:16.421600 4711 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4f07a4c6-d466-42b4-aa65-5f58fa554bbd-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:16 crc kubenswrapper[4711]: I1203 12:45:16.662769 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Dec 03 12:45:16 crc kubenswrapper[4711]: I1203 12:45:16.707098 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" 
event={"ID":"d8773367-0744-42d7-a9ee-8f508d9d9c97","Type":"ContainerStarted","Data":"9f7f38d9943add1ad6b99529e6b363d42e7d002996cd5083280992f8330074e3"} Dec 03 12:45:16 crc kubenswrapper[4711]: I1203 12:45:16.708519 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-82qq5" event={"ID":"b55ffe9c-6afa-4506-b385-46bc81594479","Type":"ContainerStarted","Data":"7151a7c92c406ab310266baf2b5848641b148dcb1096162dcaf764d05bb3a2db"} Dec 03 12:45:16 crc kubenswrapper[4711]: I1203 12:45:16.714728 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-tcsg8" event={"ID":"4f07a4c6-d466-42b4-aa65-5f58fa554bbd","Type":"ContainerDied","Data":"a0b4c2295b0f1a909f634d1458eed878ec0cf4bb8c9c7348db32e7e850bc354d"} Dec 03 12:45:16 crc kubenswrapper[4711]: I1203 12:45:16.714767 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0b4c2295b0f1a909f634d1458eed878ec0cf4bb8c9c7348db32e7e850bc354d" Dec 03 12:45:16 crc kubenswrapper[4711]: I1203 12:45:16.714960 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-tcsg8" Dec 03 12:45:16 crc kubenswrapper[4711]: I1203 12:45:16.725779 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-index-82qq5" podStartSLOduration=2.062412425 podStartE2EDuration="5.725766197s" podCreationTimestamp="2025-12-03 12:45:11 +0000 UTC" firstStartedPulling="2025-12-03 12:45:12.585136628 +0000 UTC m=+1831.254387883" lastFinishedPulling="2025-12-03 12:45:16.2484904 +0000 UTC m=+1834.917741655" observedRunningTime="2025-12-03 12:45:16.725358486 +0000 UTC m=+1835.394609751" watchObservedRunningTime="2025-12-03 12:45:16.725766197 +0000 UTC m=+1835.395017452" Dec 03 12:45:16 crc kubenswrapper[4711]: I1203 12:45:16.817541 4711 scope.go:117] "RemoveContainer" containerID="8aceb8eade31346b968b79b9eafdd0ba0740fbe4943a8f7cf724864aa6dd593f" Dec 03 12:45:16 crc kubenswrapper[4711]: E1203 12:45:16.817755 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:45:18 crc kubenswrapper[4711]: I1203 12:45:18.349851 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8a481af3-5cbd-47a8-a9b5-d8d73feab045-etc-swift\") pod \"swift-proxy-8cfd9857-v9jnw\" (UID: \"8a481af3-5cbd-47a8-a9b5-d8d73feab045\") " pod="glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw" Dec 03 12:45:18 crc kubenswrapper[4711]: I1203 12:45:18.358029 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m2tgt" Dec 03 12:45:18 crc kubenswrapper[4711]: 
I1203 12:45:18.359168 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8a481af3-5cbd-47a8-a9b5-d8d73feab045-etc-swift\") pod \"swift-proxy-8cfd9857-v9jnw\" (UID: \"8a481af3-5cbd-47a8-a9b5-d8d73feab045\") " pod="glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw" Dec 03 12:45:18 crc kubenswrapper[4711]: I1203 12:45:18.399334 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m2tgt" Dec 03 12:45:18 crc kubenswrapper[4711]: I1203 12:45:18.425831 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw" Dec 03 12:45:18 crc kubenswrapper[4711]: I1203 12:45:18.730807 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"d8773367-0744-42d7-a9ee-8f508d9d9c97","Type":"ContainerStarted","Data":"6de854a597d59a7e760c67ea08bd94e4a5b68767595f937215db286fe8d54eaf"} Dec 03 12:45:18 crc kubenswrapper[4711]: I1203 12:45:18.730853 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"d8773367-0744-42d7-a9ee-8f508d9d9c97","Type":"ContainerStarted","Data":"af4b9f279f9911ad3c8a4f01d73302ce8e2cf6cabc3eab8b0801b341a3443273"} Dec 03 12:45:18 crc kubenswrapper[4711]: I1203 12:45:18.730865 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"d8773367-0744-42d7-a9ee-8f508d9d9c97","Type":"ContainerStarted","Data":"d959fc2a5d127e63777323ca3dee377c60f043f8e8b6f32afa12b0d772262a9b"} Dec 03 12:45:18 crc kubenswrapper[4711]: I1203 12:45:18.730876 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"d8773367-0744-42d7-a9ee-8f508d9d9c97","Type":"ContainerStarted","Data":"8427ac8a149f3880f2d553dad3589cd950a74921c15d41b7f961b362b2f927d9"} Dec 03 12:45:18 crc kubenswrapper[4711]: 
I1203 12:45:18.870122 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw"] Dec 03 12:45:18 crc kubenswrapper[4711]: W1203 12:45:18.872926 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a481af3_5cbd_47a8_a9b5_d8d73feab045.slice/crio-cff16fa9ba47b72d9ef2c35e619adcca131f509265b60b5f355e56a5ea12f136 WatchSource:0}: Error finding container cff16fa9ba47b72d9ef2c35e619adcca131f509265b60b5f355e56a5ea12f136: Status 404 returned error can't find the container with id cff16fa9ba47b72d9ef2c35e619adcca131f509265b60b5f355e56a5ea12f136 Dec 03 12:45:19 crc kubenswrapper[4711]: I1203 12:45:19.762194 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw" event={"ID":"8a481af3-5cbd-47a8-a9b5-d8d73feab045","Type":"ContainerStarted","Data":"6e7a592217ba5cc946ddacc048e22bfa8b834f5139b28aff4edc5d211187c636"} Dec 03 12:45:19 crc kubenswrapper[4711]: I1203 12:45:19.762550 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw" event={"ID":"8a481af3-5cbd-47a8-a9b5-d8d73feab045","Type":"ContainerStarted","Data":"09a7832ff59eceb3412ebd5f878fc65f965df39ea596a69128752c5fbb70d6ff"} Dec 03 12:45:19 crc kubenswrapper[4711]: I1203 12:45:19.762565 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw" event={"ID":"8a481af3-5cbd-47a8-a9b5-d8d73feab045","Type":"ContainerStarted","Data":"cff16fa9ba47b72d9ef2c35e619adcca131f509265b60b5f355e56a5ea12f136"} Dec 03 12:45:19 crc kubenswrapper[4711]: I1203 12:45:19.763090 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw" Dec 03 12:45:19 crc kubenswrapper[4711]: I1203 12:45:19.763146 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw" Dec 03 12:45:19 crc kubenswrapper[4711]: I1203 12:45:19.788113 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw" podStartSLOduration=17.788089716 podStartE2EDuration="17.788089716s" podCreationTimestamp="2025-12-03 12:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:45:19.779292973 +0000 UTC m=+1838.448544228" watchObservedRunningTime="2025-12-03 12:45:19.788089716 +0000 UTC m=+1838.457340971" Dec 03 12:45:20 crc kubenswrapper[4711]: I1203 12:45:20.773150 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"d8773367-0744-42d7-a9ee-8f508d9d9c97","Type":"ContainerStarted","Data":"05890f6b1d02a1f971052f2b1657c3c851f34c23267070156385e58f0fd1a8e6"} Dec 03 12:45:20 crc kubenswrapper[4711]: I1203 12:45:20.773569 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"d8773367-0744-42d7-a9ee-8f508d9d9c97","Type":"ContainerStarted","Data":"9b0370ddda42b0c460d020cae7e9204e573304e26272449f1d6ae1b876e71e7f"} Dec 03 12:45:21 crc kubenswrapper[4711]: I1203 12:45:21.798857 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"d8773367-0744-42d7-a9ee-8f508d9d9c97","Type":"ContainerStarted","Data":"f1ddd1afdb04792a9968c1064b1b556d6f09a64de00b14741bc063f74149c1ad"} Dec 03 12:45:21 crc kubenswrapper[4711]: I1203 12:45:21.799309 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"d8773367-0744-42d7-a9ee-8f508d9d9c97","Type":"ContainerStarted","Data":"79a6092d0294dcb63156583403399f5dd1d3d2b60c3916aeac0367c62925f794"} Dec 03 12:45:22 crc kubenswrapper[4711]: I1203 12:45:22.109923 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-operators/glance-operator-index-82qq5" Dec 03 12:45:22 crc kubenswrapper[4711]: I1203 12:45:22.110037 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/glance-operator-index-82qq5" Dec 03 12:45:22 crc kubenswrapper[4711]: I1203 12:45:22.137837 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/glance-operator-index-82qq5" Dec 03 12:45:22 crc kubenswrapper[4711]: I1203 12:45:22.833871 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-index-82qq5" Dec 03 12:45:23 crc kubenswrapper[4711]: I1203 12:45:23.432381 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw" Dec 03 12:45:23 crc kubenswrapper[4711]: I1203 12:45:23.826574 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"d8773367-0744-42d7-a9ee-8f508d9d9c97","Type":"ContainerStarted","Data":"f58dd29ffc9b70ef01a2df1286ea64de4dd00ac84382217504144c074ea039d7"} Dec 03 12:45:23 crc kubenswrapper[4711]: I1203 12:45:23.826618 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"d8773367-0744-42d7-a9ee-8f508d9d9c97","Type":"ContainerStarted","Data":"441879b522e24cca59f3a7f61f0fc450a3a6f923764d4ea728bbfbcf3ec149de"} Dec 03 12:45:24 crc kubenswrapper[4711]: I1203 12:45:24.843704 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"d8773367-0744-42d7-a9ee-8f508d9d9c97","Type":"ContainerStarted","Data":"f5a2f4b8e3e62c9248502a5049114389c184889fd7485826d1f2cd77b7793058"} Dec 03 12:45:24 crc kubenswrapper[4711]: I1203 12:45:24.843742 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" 
event={"ID":"d8773367-0744-42d7-a9ee-8f508d9d9c97","Type":"ContainerStarted","Data":"841a5239b5f026b89c6ce5c188440b4965b770234fd75e50cfb308d5326659c0"} Dec 03 12:45:24 crc kubenswrapper[4711]: I1203 12:45:24.843751 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"d8773367-0744-42d7-a9ee-8f508d9d9c97","Type":"ContainerStarted","Data":"46840a641b1dc425d87633881be0410ef3b1dc533905a3e19afa45c22da577a0"} Dec 03 12:45:24 crc kubenswrapper[4711]: I1203 12:45:24.843760 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"d8773367-0744-42d7-a9ee-8f508d9d9c97","Type":"ContainerStarted","Data":"112a67a111ef5656c9e19fc43b440677c55643097f93109f9522ce7182151b94"} Dec 03 12:45:25 crc kubenswrapper[4711]: I1203 12:45:25.857686 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"d8773367-0744-42d7-a9ee-8f508d9d9c97","Type":"ContainerStarted","Data":"874c4ec6aa1398d2a94623b543507ecfb853d4971bfad78777cd56e852945040"} Dec 03 12:45:25 crc kubenswrapper[4711]: I1203 12:45:25.908235 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-storage-0" podStartSLOduration=21.153799137 podStartE2EDuration="27.908208988s" podCreationTimestamp="2025-12-03 12:44:58 +0000 UTC" firstStartedPulling="2025-12-03 12:45:16.647639047 +0000 UTC m=+1835.316890302" lastFinishedPulling="2025-12-03 12:45:23.402048898 +0000 UTC m=+1842.071300153" observedRunningTime="2025-12-03 12:45:25.902178262 +0000 UTC m=+1844.571429557" watchObservedRunningTime="2025-12-03 12:45:25.908208988 +0000 UTC m=+1844.577460253" Dec 03 12:45:26 crc kubenswrapper[4711]: I1203 12:45:26.568324 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m2tgt"] Dec 03 12:45:26 crc kubenswrapper[4711]: I1203 12:45:26.568592 4711 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-marketplace/redhat-operators-m2tgt" podUID="800e0d85-0d20-4da5-83a4-e41af85ef6d3" containerName="registry-server" containerID="cri-o://7a74767d388777a6decfa8a38990f847da29f2e2b32f26d593139432f63a3253" gracePeriod=2 Dec 03 12:45:26 crc kubenswrapper[4711]: I1203 12:45:26.867050 4711 generic.go:334] "Generic (PLEG): container finished" podID="800e0d85-0d20-4da5-83a4-e41af85ef6d3" containerID="7a74767d388777a6decfa8a38990f847da29f2e2b32f26d593139432f63a3253" exitCode=0 Dec 03 12:45:26 crc kubenswrapper[4711]: I1203 12:45:26.868379 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m2tgt" event={"ID":"800e0d85-0d20-4da5-83a4-e41af85ef6d3","Type":"ContainerDied","Data":"7a74767d388777a6decfa8a38990f847da29f2e2b32f26d593139432f63a3253"} Dec 03 12:45:26 crc kubenswrapper[4711]: I1203 12:45:26.996012 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m2tgt" Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.078928 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/800e0d85-0d20-4da5-83a4-e41af85ef6d3-catalog-content\") pod \"800e0d85-0d20-4da5-83a4-e41af85ef6d3\" (UID: \"800e0d85-0d20-4da5-83a4-e41af85ef6d3\") " Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.078994 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/800e0d85-0d20-4da5-83a4-e41af85ef6d3-utilities\") pod \"800e0d85-0d20-4da5-83a4-e41af85ef6d3\" (UID: \"800e0d85-0d20-4da5-83a4-e41af85ef6d3\") " Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.079072 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvblh\" (UniqueName: \"kubernetes.io/projected/800e0d85-0d20-4da5-83a4-e41af85ef6d3-kube-api-access-bvblh\") pod 
\"800e0d85-0d20-4da5-83a4-e41af85ef6d3\" (UID: \"800e0d85-0d20-4da5-83a4-e41af85ef6d3\") " Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.079801 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/800e0d85-0d20-4da5-83a4-e41af85ef6d3-utilities" (OuterVolumeSpecName: "utilities") pod "800e0d85-0d20-4da5-83a4-e41af85ef6d3" (UID: "800e0d85-0d20-4da5-83a4-e41af85ef6d3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.080261 4711 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/800e0d85-0d20-4da5-83a4-e41af85ef6d3-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.100099 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/800e0d85-0d20-4da5-83a4-e41af85ef6d3-kube-api-access-bvblh" (OuterVolumeSpecName: "kube-api-access-bvblh") pod "800e0d85-0d20-4da5-83a4-e41af85ef6d3" (UID: "800e0d85-0d20-4da5-83a4-e41af85ef6d3"). InnerVolumeSpecName "kube-api-access-bvblh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.181505 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvblh\" (UniqueName: \"kubernetes.io/projected/800e0d85-0d20-4da5-83a4-e41af85ef6d3-kube-api-access-bvblh\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.206553 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/800e0d85-0d20-4da5-83a4-e41af85ef6d3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "800e0d85-0d20-4da5-83a4-e41af85ef6d3" (UID: "800e0d85-0d20-4da5-83a4-e41af85ef6d3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.283387 4711 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/800e0d85-0d20-4da5-83a4-e41af85ef6d3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.417278 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz"] Dec 03 12:45:27 crc kubenswrapper[4711]: E1203 12:45:27.417549 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f07a4c6-d466-42b4-aa65-5f58fa554bbd" containerName="swift-ring-rebalance" Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.417567 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f07a4c6-d466-42b4-aa65-5f58fa554bbd" containerName="swift-ring-rebalance" Dec 03 12:45:27 crc kubenswrapper[4711]: E1203 12:45:27.417601 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="800e0d85-0d20-4da5-83a4-e41af85ef6d3" containerName="extract-utilities" Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.417611 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="800e0d85-0d20-4da5-83a4-e41af85ef6d3" containerName="extract-utilities" Dec 03 12:45:27 crc kubenswrapper[4711]: E1203 12:45:27.417621 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="800e0d85-0d20-4da5-83a4-e41af85ef6d3" containerName="extract-content" Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.417628 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="800e0d85-0d20-4da5-83a4-e41af85ef6d3" containerName="extract-content" Dec 03 12:45:27 crc kubenswrapper[4711]: E1203 12:45:27.417639 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="800e0d85-0d20-4da5-83a4-e41af85ef6d3" containerName="registry-server" Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 
12:45:27.417644 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="800e0d85-0d20-4da5-83a4-e41af85ef6d3" containerName="registry-server" Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.417769 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="800e0d85-0d20-4da5-83a4-e41af85ef6d3" containerName="registry-server" Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.417788 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f07a4c6-d466-42b4-aa65-5f58fa554bbd" containerName="swift-ring-rebalance" Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.418901 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz" Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.424282 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-gw8hz" Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.434659 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz"] Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.486224 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b104ca1-f49a-4127-8a94-e74b0834307e-bundle\") pod \"75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz\" (UID: \"2b104ca1-f49a-4127-8a94-e74b0834307e\") " pod="openstack-operators/75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz" Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.486393 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrwdb\" (UniqueName: \"kubernetes.io/projected/2b104ca1-f49a-4127-8a94-e74b0834307e-kube-api-access-lrwdb\") pod 
\"75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz\" (UID: \"2b104ca1-f49a-4127-8a94-e74b0834307e\") " pod="openstack-operators/75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz" Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.486430 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b104ca1-f49a-4127-8a94-e74b0834307e-util\") pod \"75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz\" (UID: \"2b104ca1-f49a-4127-8a94-e74b0834307e\") " pod="openstack-operators/75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz" Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.588189 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrwdb\" (UniqueName: \"kubernetes.io/projected/2b104ca1-f49a-4127-8a94-e74b0834307e-kube-api-access-lrwdb\") pod \"75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz\" (UID: \"2b104ca1-f49a-4127-8a94-e74b0834307e\") " pod="openstack-operators/75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz" Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.588535 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b104ca1-f49a-4127-8a94-e74b0834307e-util\") pod \"75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz\" (UID: \"2b104ca1-f49a-4127-8a94-e74b0834307e\") " pod="openstack-operators/75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz" Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.588587 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b104ca1-f49a-4127-8a94-e74b0834307e-bundle\") pod \"75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz\" (UID: \"2b104ca1-f49a-4127-8a94-e74b0834307e\") " 
pod="openstack-operators/75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz" Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.589096 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b104ca1-f49a-4127-8a94-e74b0834307e-util\") pod \"75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz\" (UID: \"2b104ca1-f49a-4127-8a94-e74b0834307e\") " pod="openstack-operators/75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz" Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.589152 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b104ca1-f49a-4127-8a94-e74b0834307e-bundle\") pod \"75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz\" (UID: \"2b104ca1-f49a-4127-8a94-e74b0834307e\") " pod="openstack-operators/75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz" Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.609280 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrwdb\" (UniqueName: \"kubernetes.io/projected/2b104ca1-f49a-4127-8a94-e74b0834307e-kube-api-access-lrwdb\") pod \"75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz\" (UID: \"2b104ca1-f49a-4127-8a94-e74b0834307e\") " pod="openstack-operators/75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz" Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.732589 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz" Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.879842 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m2tgt" event={"ID":"800e0d85-0d20-4da5-83a4-e41af85ef6d3","Type":"ContainerDied","Data":"2b66dcc207fc4e7417e98bce723bdd3ccb8a7ca8ce020bed18206167fce39c22"} Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.879884 4711 scope.go:117] "RemoveContainer" containerID="7a74767d388777a6decfa8a38990f847da29f2e2b32f26d593139432f63a3253" Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.880030 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m2tgt" Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.923527 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m2tgt"] Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.928798 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m2tgt"] Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.934199 4711 scope.go:117] "RemoveContainer" containerID="0befea12b3f29624b5fdb7e9ec2ea4f1ec9c21f2dc1eab8d98cc94df58e5b0de" Dec 03 12:45:27 crc kubenswrapper[4711]: I1203 12:45:27.975082 4711 scope.go:117] "RemoveContainer" containerID="1ad369ad1faf1d9165a3fd025d1d4492b453915bc2a396e582b4af9da2662382" Dec 03 12:45:28 crc kubenswrapper[4711]: W1203 12:45:28.023016 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b104ca1_f49a_4127_8a94_e74b0834307e.slice/crio-03784923dd2043db28daab6f4d0a4623e6a75bc3e9b48813dcea7d810f2e1868 WatchSource:0}: Error finding container 03784923dd2043db28daab6f4d0a4623e6a75bc3e9b48813dcea7d810f2e1868: Status 404 returned error can't find the container with id 
03784923dd2043db28daab6f4d0a4623e6a75bc3e9b48813dcea7d810f2e1868 Dec 03 12:45:28 crc kubenswrapper[4711]: I1203 12:45:28.024490 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz"] Dec 03 12:45:28 crc kubenswrapper[4711]: I1203 12:45:28.432328 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/swift-proxy-8cfd9857-v9jnw" Dec 03 12:45:28 crc kubenswrapper[4711]: I1203 12:45:28.892952 4711 generic.go:334] "Generic (PLEG): container finished" podID="2b104ca1-f49a-4127-8a94-e74b0834307e" containerID="12220a281f875771df47904c563dfd3a8a8cb14a80d5a76ecf73a65caa984b73" exitCode=0 Dec 03 12:45:28 crc kubenswrapper[4711]: I1203 12:45:28.892998 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz" event={"ID":"2b104ca1-f49a-4127-8a94-e74b0834307e","Type":"ContainerDied","Data":"12220a281f875771df47904c563dfd3a8a8cb14a80d5a76ecf73a65caa984b73"} Dec 03 12:45:28 crc kubenswrapper[4711]: I1203 12:45:28.893024 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz" event={"ID":"2b104ca1-f49a-4127-8a94-e74b0834307e","Type":"ContainerStarted","Data":"03784923dd2043db28daab6f4d0a4623e6a75bc3e9b48813dcea7d810f2e1868"} Dec 03 12:45:29 crc kubenswrapper[4711]: I1203 12:45:29.817775 4711 scope.go:117] "RemoveContainer" containerID="8aceb8eade31346b968b79b9eafdd0ba0740fbe4943a8f7cf724864aa6dd593f" Dec 03 12:45:29 crc kubenswrapper[4711]: E1203 12:45:29.818408 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:45:29 crc kubenswrapper[4711]: I1203 12:45:29.825492 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="800e0d85-0d20-4da5-83a4-e41af85ef6d3" path="/var/lib/kubelet/pods/800e0d85-0d20-4da5-83a4-e41af85ef6d3/volumes" Dec 03 12:45:29 crc kubenswrapper[4711]: I1203 12:45:29.902754 4711 generic.go:334] "Generic (PLEG): container finished" podID="2b104ca1-f49a-4127-8a94-e74b0834307e" containerID="70159d6a09afdf41b5efb7ad7582789d4d6b79cf551a1d8cf6510cfd5d5e35c3" exitCode=0 Dec 03 12:45:29 crc kubenswrapper[4711]: I1203 12:45:29.902800 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz" event={"ID":"2b104ca1-f49a-4127-8a94-e74b0834307e","Type":"ContainerDied","Data":"70159d6a09afdf41b5efb7ad7582789d4d6b79cf551a1d8cf6510cfd5d5e35c3"} Dec 03 12:45:30 crc kubenswrapper[4711]: I1203 12:45:30.911056 4711 generic.go:334] "Generic (PLEG): container finished" podID="2b104ca1-f49a-4127-8a94-e74b0834307e" containerID="0b49ca74d4e202dc786871fcc14161f885c04980c060cb1be13814730ed89371" exitCode=0 Dec 03 12:45:30 crc kubenswrapper[4711]: I1203 12:45:30.911142 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz" event={"ID":"2b104ca1-f49a-4127-8a94-e74b0834307e","Type":"ContainerDied","Data":"0b49ca74d4e202dc786871fcc14161f885c04980c060cb1be13814730ed89371"} Dec 03 12:45:32 crc kubenswrapper[4711]: I1203 12:45:32.208325 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz" Dec 03 12:45:32 crc kubenswrapper[4711]: I1203 12:45:32.267390 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b104ca1-f49a-4127-8a94-e74b0834307e-util\") pod \"2b104ca1-f49a-4127-8a94-e74b0834307e\" (UID: \"2b104ca1-f49a-4127-8a94-e74b0834307e\") " Dec 03 12:45:32 crc kubenswrapper[4711]: I1203 12:45:32.267502 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrwdb\" (UniqueName: \"kubernetes.io/projected/2b104ca1-f49a-4127-8a94-e74b0834307e-kube-api-access-lrwdb\") pod \"2b104ca1-f49a-4127-8a94-e74b0834307e\" (UID: \"2b104ca1-f49a-4127-8a94-e74b0834307e\") " Dec 03 12:45:32 crc kubenswrapper[4711]: I1203 12:45:32.267613 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b104ca1-f49a-4127-8a94-e74b0834307e-bundle\") pod \"2b104ca1-f49a-4127-8a94-e74b0834307e\" (UID: \"2b104ca1-f49a-4127-8a94-e74b0834307e\") " Dec 03 12:45:32 crc kubenswrapper[4711]: I1203 12:45:32.268739 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b104ca1-f49a-4127-8a94-e74b0834307e-bundle" (OuterVolumeSpecName: "bundle") pod "2b104ca1-f49a-4127-8a94-e74b0834307e" (UID: "2b104ca1-f49a-4127-8a94-e74b0834307e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:45:32 crc kubenswrapper[4711]: I1203 12:45:32.274545 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b104ca1-f49a-4127-8a94-e74b0834307e-kube-api-access-lrwdb" (OuterVolumeSpecName: "kube-api-access-lrwdb") pod "2b104ca1-f49a-4127-8a94-e74b0834307e" (UID: "2b104ca1-f49a-4127-8a94-e74b0834307e"). InnerVolumeSpecName "kube-api-access-lrwdb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:45:32 crc kubenswrapper[4711]: I1203 12:45:32.283282 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b104ca1-f49a-4127-8a94-e74b0834307e-util" (OuterVolumeSpecName: "util") pod "2b104ca1-f49a-4127-8a94-e74b0834307e" (UID: "2b104ca1-f49a-4127-8a94-e74b0834307e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:45:32 crc kubenswrapper[4711]: I1203 12:45:32.369637 4711 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b104ca1-f49a-4127-8a94-e74b0834307e-util\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:32 crc kubenswrapper[4711]: I1203 12:45:32.369676 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrwdb\" (UniqueName: \"kubernetes.io/projected/2b104ca1-f49a-4127-8a94-e74b0834307e-kube-api-access-lrwdb\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:32 crc kubenswrapper[4711]: I1203 12:45:32.369686 4711 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b104ca1-f49a-4127-8a94-e74b0834307e-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:32 crc kubenswrapper[4711]: I1203 12:45:32.932030 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz" event={"ID":"2b104ca1-f49a-4127-8a94-e74b0834307e","Type":"ContainerDied","Data":"03784923dd2043db28daab6f4d0a4623e6a75bc3e9b48813dcea7d810f2e1868"} Dec 03 12:45:32 crc kubenswrapper[4711]: I1203 12:45:32.932087 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03784923dd2043db28daab6f4d0a4623e6a75bc3e9b48813dcea7d810f2e1868" Dec 03 12:45:32 crc kubenswrapper[4711]: I1203 12:45:32.932106 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz" Dec 03 12:45:43 crc kubenswrapper[4711]: I1203 12:45:43.078234 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-f5bd89d87-dh9xv"] Dec 03 12:45:43 crc kubenswrapper[4711]: E1203 12:45:43.078893 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b104ca1-f49a-4127-8a94-e74b0834307e" containerName="pull" Dec 03 12:45:43 crc kubenswrapper[4711]: I1203 12:45:43.078918 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b104ca1-f49a-4127-8a94-e74b0834307e" containerName="pull" Dec 03 12:45:43 crc kubenswrapper[4711]: E1203 12:45:43.078946 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b104ca1-f49a-4127-8a94-e74b0834307e" containerName="util" Dec 03 12:45:43 crc kubenswrapper[4711]: I1203 12:45:43.078954 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b104ca1-f49a-4127-8a94-e74b0834307e" containerName="util" Dec 03 12:45:43 crc kubenswrapper[4711]: E1203 12:45:43.078969 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b104ca1-f49a-4127-8a94-e74b0834307e" containerName="extract" Dec 03 12:45:43 crc kubenswrapper[4711]: I1203 12:45:43.078976 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b104ca1-f49a-4127-8a94-e74b0834307e" containerName="extract" Dec 03 12:45:43 crc kubenswrapper[4711]: I1203 12:45:43.079087 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b104ca1-f49a-4127-8a94-e74b0834307e" containerName="extract" Dec 03 12:45:43 crc kubenswrapper[4711]: I1203 12:45:43.079530 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-f5bd89d87-dh9xv" Dec 03 12:45:43 crc kubenswrapper[4711]: I1203 12:45:43.082025 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-gzksl" Dec 03 12:45:43 crc kubenswrapper[4711]: I1203 12:45:43.082822 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-service-cert" Dec 03 12:45:43 crc kubenswrapper[4711]: I1203 12:45:43.103386 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-f5bd89d87-dh9xv"] Dec 03 12:45:43 crc kubenswrapper[4711]: I1203 12:45:43.141157 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ptwx\" (UniqueName: \"kubernetes.io/projected/885ab9bd-dfa6-4245-a8ac-126a9911538a-kube-api-access-8ptwx\") pod \"glance-operator-controller-manager-f5bd89d87-dh9xv\" (UID: \"885ab9bd-dfa6-4245-a8ac-126a9911538a\") " pod="openstack-operators/glance-operator-controller-manager-f5bd89d87-dh9xv" Dec 03 12:45:43 crc kubenswrapper[4711]: I1203 12:45:43.141239 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/885ab9bd-dfa6-4245-a8ac-126a9911538a-apiservice-cert\") pod \"glance-operator-controller-manager-f5bd89d87-dh9xv\" (UID: \"885ab9bd-dfa6-4245-a8ac-126a9911538a\") " pod="openstack-operators/glance-operator-controller-manager-f5bd89d87-dh9xv" Dec 03 12:45:43 crc kubenswrapper[4711]: I1203 12:45:43.141459 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/885ab9bd-dfa6-4245-a8ac-126a9911538a-webhook-cert\") pod \"glance-operator-controller-manager-f5bd89d87-dh9xv\" (UID: 
\"885ab9bd-dfa6-4245-a8ac-126a9911538a\") " pod="openstack-operators/glance-operator-controller-manager-f5bd89d87-dh9xv" Dec 03 12:45:43 crc kubenswrapper[4711]: I1203 12:45:43.242585 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/885ab9bd-dfa6-4245-a8ac-126a9911538a-webhook-cert\") pod \"glance-operator-controller-manager-f5bd89d87-dh9xv\" (UID: \"885ab9bd-dfa6-4245-a8ac-126a9911538a\") " pod="openstack-operators/glance-operator-controller-manager-f5bd89d87-dh9xv" Dec 03 12:45:43 crc kubenswrapper[4711]: I1203 12:45:43.242676 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ptwx\" (UniqueName: \"kubernetes.io/projected/885ab9bd-dfa6-4245-a8ac-126a9911538a-kube-api-access-8ptwx\") pod \"glance-operator-controller-manager-f5bd89d87-dh9xv\" (UID: \"885ab9bd-dfa6-4245-a8ac-126a9911538a\") " pod="openstack-operators/glance-operator-controller-manager-f5bd89d87-dh9xv" Dec 03 12:45:43 crc kubenswrapper[4711]: I1203 12:45:43.242701 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/885ab9bd-dfa6-4245-a8ac-126a9911538a-apiservice-cert\") pod \"glance-operator-controller-manager-f5bd89d87-dh9xv\" (UID: \"885ab9bd-dfa6-4245-a8ac-126a9911538a\") " pod="openstack-operators/glance-operator-controller-manager-f5bd89d87-dh9xv" Dec 03 12:45:43 crc kubenswrapper[4711]: I1203 12:45:43.248181 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/885ab9bd-dfa6-4245-a8ac-126a9911538a-apiservice-cert\") pod \"glance-operator-controller-manager-f5bd89d87-dh9xv\" (UID: \"885ab9bd-dfa6-4245-a8ac-126a9911538a\") " pod="openstack-operators/glance-operator-controller-manager-f5bd89d87-dh9xv" Dec 03 12:45:43 crc kubenswrapper[4711]: I1203 12:45:43.248528 4711 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/885ab9bd-dfa6-4245-a8ac-126a9911538a-webhook-cert\") pod \"glance-operator-controller-manager-f5bd89d87-dh9xv\" (UID: \"885ab9bd-dfa6-4245-a8ac-126a9911538a\") " pod="openstack-operators/glance-operator-controller-manager-f5bd89d87-dh9xv" Dec 03 12:45:43 crc kubenswrapper[4711]: I1203 12:45:43.263702 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ptwx\" (UniqueName: \"kubernetes.io/projected/885ab9bd-dfa6-4245-a8ac-126a9911538a-kube-api-access-8ptwx\") pod \"glance-operator-controller-manager-f5bd89d87-dh9xv\" (UID: \"885ab9bd-dfa6-4245-a8ac-126a9911538a\") " pod="openstack-operators/glance-operator-controller-manager-f5bd89d87-dh9xv" Dec 03 12:45:43 crc kubenswrapper[4711]: I1203 12:45:43.400181 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-f5bd89d87-dh9xv" Dec 03 12:45:43 crc kubenswrapper[4711]: I1203 12:45:43.922486 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-f5bd89d87-dh9xv"] Dec 03 12:45:44 crc kubenswrapper[4711]: I1203 12:45:44.014966 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-f5bd89d87-dh9xv" event={"ID":"885ab9bd-dfa6-4245-a8ac-126a9911538a","Type":"ContainerStarted","Data":"375ff87082034f5c999f1d81a9cd753392f67249181e97388b58433eeb3fcfa5"} Dec 03 12:45:44 crc kubenswrapper[4711]: I1203 12:45:44.817190 4711 scope.go:117] "RemoveContainer" containerID="8aceb8eade31346b968b79b9eafdd0ba0740fbe4943a8f7cf724864aa6dd593f" Dec 03 12:45:45 crc kubenswrapper[4711]: I1203 12:45:45.027626 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" 
event={"ID":"776e7d35-d59b-4d4a-97cd-aec4f2441c1e","Type":"ContainerStarted","Data":"bf5aaef3ea300423cb3fae0894d5ba5f74a380ce9c593a8a024e4a722f79eb6b"} Dec 03 12:45:47 crc kubenswrapper[4711]: I1203 12:45:47.043369 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-f5bd89d87-dh9xv" event={"ID":"885ab9bd-dfa6-4245-a8ac-126a9911538a","Type":"ContainerStarted","Data":"08eacc44bba63b8027cc92bcfa88900fd88da3770b4b263d8c757c1c63c5db4b"} Dec 03 12:45:47 crc kubenswrapper[4711]: I1203 12:45:47.043955 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-f5bd89d87-dh9xv" Dec 03 12:45:47 crc kubenswrapper[4711]: I1203 12:45:47.064653 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-f5bd89d87-dh9xv" podStartSLOduration=1.5382351490000001 podStartE2EDuration="4.064631216s" podCreationTimestamp="2025-12-03 12:45:43 +0000 UTC" firstStartedPulling="2025-12-03 12:45:43.931295674 +0000 UTC m=+1862.600546929" lastFinishedPulling="2025-12-03 12:45:46.457691751 +0000 UTC m=+1865.126942996" observedRunningTime="2025-12-03 12:45:47.059900007 +0000 UTC m=+1865.729151282" watchObservedRunningTime="2025-12-03 12:45:47.064631216 +0000 UTC m=+1865.733882471" Dec 03 12:45:53 crc kubenswrapper[4711]: I1203 12:45:53.405494 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-f5bd89d87-dh9xv" Dec 03 12:45:54 crc kubenswrapper[4711]: I1203 12:45:54.669399 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstackclient"] Dec 03 12:45:54 crc kubenswrapper[4711]: I1203 12:45:54.670436 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstackclient" Dec 03 12:45:54 crc kubenswrapper[4711]: I1203 12:45:54.674037 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"default-dockercfg-nhgb9" Dec 03 12:45:54 crc kubenswrapper[4711]: I1203 12:45:54.674345 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts-9db6gc427h" Dec 03 12:45:54 crc kubenswrapper[4711]: I1203 12:45:54.677461 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Dec 03 12:45:54 crc kubenswrapper[4711]: I1203 12:45:54.681049 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"openstack-config-secret" Dec 03 12:45:54 crc kubenswrapper[4711]: I1203 12:45:54.689266 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config" Dec 03 12:45:54 crc kubenswrapper[4711]: I1203 12:45:54.744743 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0af24c58-d931-49f4-b886-67dd862d0170-openstack-config\") pod \"openstackclient\" (UID: \"0af24c58-d931-49f4-b886-67dd862d0170\") " pod="glance-kuttl-tests/openstackclient" Dec 03 12:45:54 crc kubenswrapper[4711]: I1203 12:45:54.744805 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/0af24c58-d931-49f4-b886-67dd862d0170-openstack-scripts\") pod \"openstackclient\" (UID: \"0af24c58-d931-49f4-b886-67dd862d0170\") " pod="glance-kuttl-tests/openstackclient" Dec 03 12:45:54 crc kubenswrapper[4711]: I1203 12:45:54.744837 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/0af24c58-d931-49f4-b886-67dd862d0170-openstack-config-secret\") pod \"openstackclient\" (UID: \"0af24c58-d931-49f4-b886-67dd862d0170\") " pod="glance-kuttl-tests/openstackclient" Dec 03 12:45:54 crc kubenswrapper[4711]: I1203 12:45:54.745001 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdbz8\" (UniqueName: \"kubernetes.io/projected/0af24c58-d931-49f4-b886-67dd862d0170-kube-api-access-xdbz8\") pod \"openstackclient\" (UID: \"0af24c58-d931-49f4-b886-67dd862d0170\") " pod="glance-kuttl-tests/openstackclient" Dec 03 12:45:54 crc kubenswrapper[4711]: I1203 12:45:54.846476 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0af24c58-d931-49f4-b886-67dd862d0170-openstack-config\") pod \"openstackclient\" (UID: \"0af24c58-d931-49f4-b886-67dd862d0170\") " pod="glance-kuttl-tests/openstackclient" Dec 03 12:45:54 crc kubenswrapper[4711]: I1203 12:45:54.846528 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/0af24c58-d931-49f4-b886-67dd862d0170-openstack-scripts\") pod \"openstackclient\" (UID: \"0af24c58-d931-49f4-b886-67dd862d0170\") " pod="glance-kuttl-tests/openstackclient" Dec 03 12:45:54 crc kubenswrapper[4711]: I1203 12:45:54.846549 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0af24c58-d931-49f4-b886-67dd862d0170-openstack-config-secret\") pod \"openstackclient\" (UID: \"0af24c58-d931-49f4-b886-67dd862d0170\") " pod="glance-kuttl-tests/openstackclient" Dec 03 12:45:54 crc kubenswrapper[4711]: I1203 12:45:54.846616 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdbz8\" (UniqueName: 
\"kubernetes.io/projected/0af24c58-d931-49f4-b886-67dd862d0170-kube-api-access-xdbz8\") pod \"openstackclient\" (UID: \"0af24c58-d931-49f4-b886-67dd862d0170\") " pod="glance-kuttl-tests/openstackclient" Dec 03 12:45:54 crc kubenswrapper[4711]: I1203 12:45:54.848546 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/0af24c58-d931-49f4-b886-67dd862d0170-openstack-scripts\") pod \"openstackclient\" (UID: \"0af24c58-d931-49f4-b886-67dd862d0170\") " pod="glance-kuttl-tests/openstackclient" Dec 03 12:45:54 crc kubenswrapper[4711]: I1203 12:45:54.848562 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0af24c58-d931-49f4-b886-67dd862d0170-openstack-config\") pod \"openstackclient\" (UID: \"0af24c58-d931-49f4-b886-67dd862d0170\") " pod="glance-kuttl-tests/openstackclient" Dec 03 12:45:54 crc kubenswrapper[4711]: I1203 12:45:54.857507 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0af24c58-d931-49f4-b886-67dd862d0170-openstack-config-secret\") pod \"openstackclient\" (UID: \"0af24c58-d931-49f4-b886-67dd862d0170\") " pod="glance-kuttl-tests/openstackclient" Dec 03 12:45:54 crc kubenswrapper[4711]: I1203 12:45:54.864228 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdbz8\" (UniqueName: \"kubernetes.io/projected/0af24c58-d931-49f4-b886-67dd862d0170-kube-api-access-xdbz8\") pod \"openstackclient\" (UID: \"0af24c58-d931-49f4-b886-67dd862d0170\") " pod="glance-kuttl-tests/openstackclient" Dec 03 12:45:54 crc kubenswrapper[4711]: I1203 12:45:54.994107 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstackclient" Dec 03 12:45:55 crc kubenswrapper[4711]: I1203 12:45:55.482976 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Dec 03 12:45:56 crc kubenswrapper[4711]: I1203 12:45:56.112940 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"0af24c58-d931-49f4-b886-67dd862d0170","Type":"ContainerStarted","Data":"a5bcc8c95728dcd67433857f169ca0418914a7f2c20b140f08d4fe783411c50b"} Dec 03 12:45:56 crc kubenswrapper[4711]: I1203 12:45:56.913798 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-q2b5k"] Dec 03 12:45:56 crc kubenswrapper[4711]: I1203 12:45:56.915251 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-q2b5k" Dec 03 12:45:56 crc kubenswrapper[4711]: I1203 12:45:56.920310 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-q2b5k"] Dec 03 12:45:56 crc kubenswrapper[4711]: I1203 12:45:56.933311 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-3e42-account-create-update-jgzx7"] Dec 03 12:45:56 crc kubenswrapper[4711]: I1203 12:45:56.934586 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-3e42-account-create-update-jgzx7" Dec 03 12:45:56 crc kubenswrapper[4711]: I1203 12:45:56.936365 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Dec 03 12:45:56 crc kubenswrapper[4711]: I1203 12:45:56.950050 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-3e42-account-create-update-jgzx7"] Dec 03 12:45:57 crc kubenswrapper[4711]: I1203 12:45:57.107034 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/318f131d-3285-4143-bb7c-495038d90363-operator-scripts\") pod \"glance-db-create-q2b5k\" (UID: \"318f131d-3285-4143-bb7c-495038d90363\") " pod="glance-kuttl-tests/glance-db-create-q2b5k" Dec 03 12:45:57 crc kubenswrapper[4711]: I1203 12:45:57.107239 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh7bh\" (UniqueName: \"kubernetes.io/projected/784295a4-13fe-4196-adb7-a007bf0c487e-kube-api-access-wh7bh\") pod \"glance-3e42-account-create-update-jgzx7\" (UID: \"784295a4-13fe-4196-adb7-a007bf0c487e\") " pod="glance-kuttl-tests/glance-3e42-account-create-update-jgzx7" Dec 03 12:45:57 crc kubenswrapper[4711]: I1203 12:45:57.107307 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/784295a4-13fe-4196-adb7-a007bf0c487e-operator-scripts\") pod \"glance-3e42-account-create-update-jgzx7\" (UID: \"784295a4-13fe-4196-adb7-a007bf0c487e\") " pod="glance-kuttl-tests/glance-3e42-account-create-update-jgzx7" Dec 03 12:45:57 crc kubenswrapper[4711]: I1203 12:45:57.107353 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvv7q\" (UniqueName: 
\"kubernetes.io/projected/318f131d-3285-4143-bb7c-495038d90363-kube-api-access-wvv7q\") pod \"glance-db-create-q2b5k\" (UID: \"318f131d-3285-4143-bb7c-495038d90363\") " pod="glance-kuttl-tests/glance-db-create-q2b5k" Dec 03 12:45:57 crc kubenswrapper[4711]: I1203 12:45:57.208826 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvv7q\" (UniqueName: \"kubernetes.io/projected/318f131d-3285-4143-bb7c-495038d90363-kube-api-access-wvv7q\") pod \"glance-db-create-q2b5k\" (UID: \"318f131d-3285-4143-bb7c-495038d90363\") " pod="glance-kuttl-tests/glance-db-create-q2b5k" Dec 03 12:45:57 crc kubenswrapper[4711]: I1203 12:45:57.209152 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/318f131d-3285-4143-bb7c-495038d90363-operator-scripts\") pod \"glance-db-create-q2b5k\" (UID: \"318f131d-3285-4143-bb7c-495038d90363\") " pod="glance-kuttl-tests/glance-db-create-q2b5k" Dec 03 12:45:57 crc kubenswrapper[4711]: I1203 12:45:57.209181 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh7bh\" (UniqueName: \"kubernetes.io/projected/784295a4-13fe-4196-adb7-a007bf0c487e-kube-api-access-wh7bh\") pod \"glance-3e42-account-create-update-jgzx7\" (UID: \"784295a4-13fe-4196-adb7-a007bf0c487e\") " pod="glance-kuttl-tests/glance-3e42-account-create-update-jgzx7" Dec 03 12:45:57 crc kubenswrapper[4711]: I1203 12:45:57.209238 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/784295a4-13fe-4196-adb7-a007bf0c487e-operator-scripts\") pod \"glance-3e42-account-create-update-jgzx7\" (UID: \"784295a4-13fe-4196-adb7-a007bf0c487e\") " pod="glance-kuttl-tests/glance-3e42-account-create-update-jgzx7" Dec 03 12:45:57 crc kubenswrapper[4711]: I1203 12:45:57.210053 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/784295a4-13fe-4196-adb7-a007bf0c487e-operator-scripts\") pod \"glance-3e42-account-create-update-jgzx7\" (UID: \"784295a4-13fe-4196-adb7-a007bf0c487e\") " pod="glance-kuttl-tests/glance-3e42-account-create-update-jgzx7" Dec 03 12:45:57 crc kubenswrapper[4711]: I1203 12:45:57.211078 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/318f131d-3285-4143-bb7c-495038d90363-operator-scripts\") pod \"glance-db-create-q2b5k\" (UID: \"318f131d-3285-4143-bb7c-495038d90363\") " pod="glance-kuttl-tests/glance-db-create-q2b5k" Dec 03 12:45:57 crc kubenswrapper[4711]: I1203 12:45:57.226570 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvv7q\" (UniqueName: \"kubernetes.io/projected/318f131d-3285-4143-bb7c-495038d90363-kube-api-access-wvv7q\") pod \"glance-db-create-q2b5k\" (UID: \"318f131d-3285-4143-bb7c-495038d90363\") " pod="glance-kuttl-tests/glance-db-create-q2b5k" Dec 03 12:45:57 crc kubenswrapper[4711]: I1203 12:45:57.226873 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh7bh\" (UniqueName: \"kubernetes.io/projected/784295a4-13fe-4196-adb7-a007bf0c487e-kube-api-access-wh7bh\") pod \"glance-3e42-account-create-update-jgzx7\" (UID: \"784295a4-13fe-4196-adb7-a007bf0c487e\") " pod="glance-kuttl-tests/glance-3e42-account-create-update-jgzx7" Dec 03 12:45:57 crc kubenswrapper[4711]: I1203 12:45:57.239818 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-q2b5k" Dec 03 12:45:57 crc kubenswrapper[4711]: I1203 12:45:57.251846 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-3e42-account-create-update-jgzx7" Dec 03 12:45:57 crc kubenswrapper[4711]: I1203 12:45:57.706151 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-3e42-account-create-update-jgzx7"] Dec 03 12:45:57 crc kubenswrapper[4711]: W1203 12:45:57.714760 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod784295a4_13fe_4196_adb7_a007bf0c487e.slice/crio-195a248902c820f0646d1329735f5d5b5a24581afc97d7e1ee9c227f88bba189 WatchSource:0}: Error finding container 195a248902c820f0646d1329735f5d5b5a24581afc97d7e1ee9c227f88bba189: Status 404 returned error can't find the container with id 195a248902c820f0646d1329735f5d5b5a24581afc97d7e1ee9c227f88bba189 Dec 03 12:45:57 crc kubenswrapper[4711]: I1203 12:45:57.750130 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-q2b5k"] Dec 03 12:45:57 crc kubenswrapper[4711]: W1203 12:45:57.757601 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod318f131d_3285_4143_bb7c_495038d90363.slice/crio-7c5f1599a836e3a77fb7557a92ba51aa47b83335b938a83f97878c9400e89372 WatchSource:0}: Error finding container 7c5f1599a836e3a77fb7557a92ba51aa47b83335b938a83f97878c9400e89372: Status 404 returned error can't find the container with id 7c5f1599a836e3a77fb7557a92ba51aa47b83335b938a83f97878c9400e89372 Dec 03 12:45:58 crc kubenswrapper[4711]: I1203 12:45:58.133499 4711 generic.go:334] "Generic (PLEG): container finished" podID="318f131d-3285-4143-bb7c-495038d90363" containerID="ee9a67439fed18de902c8202f3607e93d9973fe87e7a1974fd509c9c3288ace8" exitCode=0 Dec 03 12:45:58 crc kubenswrapper[4711]: I1203 12:45:58.133601 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-q2b5k" 
event={"ID":"318f131d-3285-4143-bb7c-495038d90363","Type":"ContainerDied","Data":"ee9a67439fed18de902c8202f3607e93d9973fe87e7a1974fd509c9c3288ace8"} Dec 03 12:45:58 crc kubenswrapper[4711]: I1203 12:45:58.133969 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-q2b5k" event={"ID":"318f131d-3285-4143-bb7c-495038d90363","Type":"ContainerStarted","Data":"7c5f1599a836e3a77fb7557a92ba51aa47b83335b938a83f97878c9400e89372"} Dec 03 12:45:58 crc kubenswrapper[4711]: I1203 12:45:58.137100 4711 generic.go:334] "Generic (PLEG): container finished" podID="784295a4-13fe-4196-adb7-a007bf0c487e" containerID="3d572da822364ebd58f03c0cc3751a0d65b14cbad06f812de1f0098afe72fa4f" exitCode=0 Dec 03 12:45:58 crc kubenswrapper[4711]: I1203 12:45:58.137172 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-3e42-account-create-update-jgzx7" event={"ID":"784295a4-13fe-4196-adb7-a007bf0c487e","Type":"ContainerDied","Data":"3d572da822364ebd58f03c0cc3751a0d65b14cbad06f812de1f0098afe72fa4f"} Dec 03 12:45:58 crc kubenswrapper[4711]: I1203 12:45:58.137253 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-3e42-account-create-update-jgzx7" event={"ID":"784295a4-13fe-4196-adb7-a007bf0c487e","Type":"ContainerStarted","Data":"195a248902c820f0646d1329735f5d5b5a24581afc97d7e1ee9c227f88bba189"} Dec 03 12:46:03 crc kubenswrapper[4711]: I1203 12:46:03.397955 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-q2b5k" Dec 03 12:46:03 crc kubenswrapper[4711]: I1203 12:46:03.406216 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-3e42-account-create-update-jgzx7" Dec 03 12:46:03 crc kubenswrapper[4711]: I1203 12:46:03.413519 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvv7q\" (UniqueName: \"kubernetes.io/projected/318f131d-3285-4143-bb7c-495038d90363-kube-api-access-wvv7q\") pod \"318f131d-3285-4143-bb7c-495038d90363\" (UID: \"318f131d-3285-4143-bb7c-495038d90363\") " Dec 03 12:46:03 crc kubenswrapper[4711]: I1203 12:46:03.413605 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/318f131d-3285-4143-bb7c-495038d90363-operator-scripts\") pod \"318f131d-3285-4143-bb7c-495038d90363\" (UID: \"318f131d-3285-4143-bb7c-495038d90363\") " Dec 03 12:46:03 crc kubenswrapper[4711]: I1203 12:46:03.417087 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/318f131d-3285-4143-bb7c-495038d90363-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "318f131d-3285-4143-bb7c-495038d90363" (UID: "318f131d-3285-4143-bb7c-495038d90363"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:46:03 crc kubenswrapper[4711]: I1203 12:46:03.445204 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/318f131d-3285-4143-bb7c-495038d90363-kube-api-access-wvv7q" (OuterVolumeSpecName: "kube-api-access-wvv7q") pod "318f131d-3285-4143-bb7c-495038d90363" (UID: "318f131d-3285-4143-bb7c-495038d90363"). InnerVolumeSpecName "kube-api-access-wvv7q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:46:03 crc kubenswrapper[4711]: I1203 12:46:03.516107 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/784295a4-13fe-4196-adb7-a007bf0c487e-operator-scripts\") pod \"784295a4-13fe-4196-adb7-a007bf0c487e\" (UID: \"784295a4-13fe-4196-adb7-a007bf0c487e\") " Dec 03 12:46:03 crc kubenswrapper[4711]: I1203 12:46:03.516538 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh7bh\" (UniqueName: \"kubernetes.io/projected/784295a4-13fe-4196-adb7-a007bf0c487e-kube-api-access-wh7bh\") pod \"784295a4-13fe-4196-adb7-a007bf0c487e\" (UID: \"784295a4-13fe-4196-adb7-a007bf0c487e\") " Dec 03 12:46:03 crc kubenswrapper[4711]: I1203 12:46:03.516648 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/784295a4-13fe-4196-adb7-a007bf0c487e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "784295a4-13fe-4196-adb7-a007bf0c487e" (UID: "784295a4-13fe-4196-adb7-a007bf0c487e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:46:03 crc kubenswrapper[4711]: I1203 12:46:03.517133 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvv7q\" (UniqueName: \"kubernetes.io/projected/318f131d-3285-4143-bb7c-495038d90363-kube-api-access-wvv7q\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:03 crc kubenswrapper[4711]: I1203 12:46:03.517151 4711 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/784295a4-13fe-4196-adb7-a007bf0c487e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:03 crc kubenswrapper[4711]: I1203 12:46:03.517167 4711 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/318f131d-3285-4143-bb7c-495038d90363-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:03 crc kubenswrapper[4711]: I1203 12:46:03.519600 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/784295a4-13fe-4196-adb7-a007bf0c487e-kube-api-access-wh7bh" (OuterVolumeSpecName: "kube-api-access-wh7bh") pod "784295a4-13fe-4196-adb7-a007bf0c487e" (UID: "784295a4-13fe-4196-adb7-a007bf0c487e"). InnerVolumeSpecName "kube-api-access-wh7bh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:46:03 crc kubenswrapper[4711]: I1203 12:46:03.618047 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wh7bh\" (UniqueName: \"kubernetes.io/projected/784295a4-13fe-4196-adb7-a007bf0c487e-kube-api-access-wh7bh\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:04 crc kubenswrapper[4711]: I1203 12:46:04.177267 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"0af24c58-d931-49f4-b886-67dd862d0170","Type":"ContainerStarted","Data":"1976870aeb15b636693c42dd9812672839446558831c81fbb5ccee975b7466d7"} Dec 03 12:46:04 crc kubenswrapper[4711]: I1203 12:46:04.179290 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-3e42-account-create-update-jgzx7" event={"ID":"784295a4-13fe-4196-adb7-a007bf0c487e","Type":"ContainerDied","Data":"195a248902c820f0646d1329735f5d5b5a24581afc97d7e1ee9c227f88bba189"} Dec 03 12:46:04 crc kubenswrapper[4711]: I1203 12:46:04.179366 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="195a248902c820f0646d1329735f5d5b5a24581afc97d7e1ee9c227f88bba189" Dec 03 12:46:04 crc kubenswrapper[4711]: I1203 12:46:04.179372 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-3e42-account-create-update-jgzx7" Dec 03 12:46:04 crc kubenswrapper[4711]: I1203 12:46:04.181738 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-q2b5k" event={"ID":"318f131d-3285-4143-bb7c-495038d90363","Type":"ContainerDied","Data":"7c5f1599a836e3a77fb7557a92ba51aa47b83335b938a83f97878c9400e89372"} Dec 03 12:46:04 crc kubenswrapper[4711]: I1203 12:46:04.182044 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c5f1599a836e3a77fb7557a92ba51aa47b83335b938a83f97878c9400e89372" Dec 03 12:46:04 crc kubenswrapper[4711]: I1203 12:46:04.181855 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-q2b5k" Dec 03 12:46:04 crc kubenswrapper[4711]: I1203 12:46:04.197774 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstackclient" podStartSLOduration=2.292034575 podStartE2EDuration="10.197756885s" podCreationTimestamp="2025-12-03 12:45:54 +0000 UTC" firstStartedPulling="2025-12-03 12:45:55.487656245 +0000 UTC m=+1874.156907500" lastFinishedPulling="2025-12-03 12:46:03.393378555 +0000 UTC m=+1882.062629810" observedRunningTime="2025-12-03 12:46:04.19506066 +0000 UTC m=+1882.864311925" watchObservedRunningTime="2025-12-03 12:46:04.197756885 +0000 UTC m=+1882.867008140" Dec 03 12:46:07 crc kubenswrapper[4711]: I1203 12:46:07.256419 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-lw7wx"] Dec 03 12:46:07 crc kubenswrapper[4711]: E1203 12:46:07.256976 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="318f131d-3285-4143-bb7c-495038d90363" containerName="mariadb-database-create" Dec 03 12:46:07 crc kubenswrapper[4711]: I1203 12:46:07.256988 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="318f131d-3285-4143-bb7c-495038d90363" 
containerName="mariadb-database-create" Dec 03 12:46:07 crc kubenswrapper[4711]: E1203 12:46:07.257012 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="784295a4-13fe-4196-adb7-a007bf0c487e" containerName="mariadb-account-create-update" Dec 03 12:46:07 crc kubenswrapper[4711]: I1203 12:46:07.257019 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="784295a4-13fe-4196-adb7-a007bf0c487e" containerName="mariadb-account-create-update" Dec 03 12:46:07 crc kubenswrapper[4711]: I1203 12:46:07.257156 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="784295a4-13fe-4196-adb7-a007bf0c487e" containerName="mariadb-account-create-update" Dec 03 12:46:07 crc kubenswrapper[4711]: I1203 12:46:07.257175 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="318f131d-3285-4143-bb7c-495038d90363" containerName="mariadb-database-create" Dec 03 12:46:07 crc kubenswrapper[4711]: I1203 12:46:07.257585 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-lw7wx" Dec 03 12:46:07 crc kubenswrapper[4711]: I1203 12:46:07.259697 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-nlh2p" Dec 03 12:46:07 crc kubenswrapper[4711]: I1203 12:46:07.260807 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Dec 03 12:46:07 crc kubenswrapper[4711]: I1203 12:46:07.268174 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-lw7wx"] Dec 03 12:46:07 crc kubenswrapper[4711]: I1203 12:46:07.269892 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b93a523c-70d8-4e6a-a2f8-84eb85ea64d6-db-sync-config-data\") pod \"glance-db-sync-lw7wx\" (UID: \"b93a523c-70d8-4e6a-a2f8-84eb85ea64d6\") " pod="glance-kuttl-tests/glance-db-sync-lw7wx" Dec 03 12:46:07 crc kubenswrapper[4711]: I1203 12:46:07.269991 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b93a523c-70d8-4e6a-a2f8-84eb85ea64d6-config-data\") pod \"glance-db-sync-lw7wx\" (UID: \"b93a523c-70d8-4e6a-a2f8-84eb85ea64d6\") " pod="glance-kuttl-tests/glance-db-sync-lw7wx" Dec 03 12:46:07 crc kubenswrapper[4711]: I1203 12:46:07.270029 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hjh9\" (UniqueName: \"kubernetes.io/projected/b93a523c-70d8-4e6a-a2f8-84eb85ea64d6-kube-api-access-4hjh9\") pod \"glance-db-sync-lw7wx\" (UID: \"b93a523c-70d8-4e6a-a2f8-84eb85ea64d6\") " pod="glance-kuttl-tests/glance-db-sync-lw7wx" Dec 03 12:46:07 crc kubenswrapper[4711]: I1203 12:46:07.371330 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/b93a523c-70d8-4e6a-a2f8-84eb85ea64d6-db-sync-config-data\") pod \"glance-db-sync-lw7wx\" (UID: \"b93a523c-70d8-4e6a-a2f8-84eb85ea64d6\") " pod="glance-kuttl-tests/glance-db-sync-lw7wx" Dec 03 12:46:07 crc kubenswrapper[4711]: I1203 12:46:07.371399 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b93a523c-70d8-4e6a-a2f8-84eb85ea64d6-config-data\") pod \"glance-db-sync-lw7wx\" (UID: \"b93a523c-70d8-4e6a-a2f8-84eb85ea64d6\") " pod="glance-kuttl-tests/glance-db-sync-lw7wx" Dec 03 12:46:07 crc kubenswrapper[4711]: I1203 12:46:07.371430 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hjh9\" (UniqueName: \"kubernetes.io/projected/b93a523c-70d8-4e6a-a2f8-84eb85ea64d6-kube-api-access-4hjh9\") pod \"glance-db-sync-lw7wx\" (UID: \"b93a523c-70d8-4e6a-a2f8-84eb85ea64d6\") " pod="glance-kuttl-tests/glance-db-sync-lw7wx" Dec 03 12:46:07 crc kubenswrapper[4711]: I1203 12:46:07.376633 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b93a523c-70d8-4e6a-a2f8-84eb85ea64d6-db-sync-config-data\") pod \"glance-db-sync-lw7wx\" (UID: \"b93a523c-70d8-4e6a-a2f8-84eb85ea64d6\") " pod="glance-kuttl-tests/glance-db-sync-lw7wx" Dec 03 12:46:07 crc kubenswrapper[4711]: I1203 12:46:07.376853 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b93a523c-70d8-4e6a-a2f8-84eb85ea64d6-config-data\") pod \"glance-db-sync-lw7wx\" (UID: \"b93a523c-70d8-4e6a-a2f8-84eb85ea64d6\") " pod="glance-kuttl-tests/glance-db-sync-lw7wx" Dec 03 12:46:07 crc kubenswrapper[4711]: I1203 12:46:07.404486 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hjh9\" (UniqueName: \"kubernetes.io/projected/b93a523c-70d8-4e6a-a2f8-84eb85ea64d6-kube-api-access-4hjh9\") pod 
\"glance-db-sync-lw7wx\" (UID: \"b93a523c-70d8-4e6a-a2f8-84eb85ea64d6\") " pod="glance-kuttl-tests/glance-db-sync-lw7wx" Dec 03 12:46:07 crc kubenswrapper[4711]: I1203 12:46:07.574976 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-lw7wx" Dec 03 12:46:08 crc kubenswrapper[4711]: I1203 12:46:08.062090 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-lw7wx"] Dec 03 12:46:08 crc kubenswrapper[4711]: W1203 12:46:08.065539 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb93a523c_70d8_4e6a_a2f8_84eb85ea64d6.slice/crio-297a8551dfddab15329806556fbac92f648d91b3afe09f8e3dea8f17a2273486 WatchSource:0}: Error finding container 297a8551dfddab15329806556fbac92f648d91b3afe09f8e3dea8f17a2273486: Status 404 returned error can't find the container with id 297a8551dfddab15329806556fbac92f648d91b3afe09f8e3dea8f17a2273486 Dec 03 12:46:08 crc kubenswrapper[4711]: I1203 12:46:08.207915 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-lw7wx" event={"ID":"b93a523c-70d8-4e6a-a2f8-84eb85ea64d6","Type":"ContainerStarted","Data":"297a8551dfddab15329806556fbac92f648d91b3afe09f8e3dea8f17a2273486"} Dec 03 12:46:21 crc kubenswrapper[4711]: I1203 12:46:21.314451 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-lw7wx" event={"ID":"b93a523c-70d8-4e6a-a2f8-84eb85ea64d6","Type":"ContainerStarted","Data":"8dfd6b14f1478b539768c44ce6e52d71d7a2141500806e78386bbbae36504b00"} Dec 03 12:46:21 crc kubenswrapper[4711]: I1203 12:46:21.334840 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-lw7wx" podStartSLOduration=1.966372848 podStartE2EDuration="14.334824283s" podCreationTimestamp="2025-12-03 12:46:07 +0000 UTC" firstStartedPulling="2025-12-03 12:46:08.067878108 
+0000 UTC m=+1886.737129363" lastFinishedPulling="2025-12-03 12:46:20.436329503 +0000 UTC m=+1899.105580798" observedRunningTime="2025-12-03 12:46:21.330931546 +0000 UTC m=+1900.000182821" watchObservedRunningTime="2025-12-03 12:46:21.334824283 +0000 UTC m=+1900.004075538" Dec 03 12:46:37 crc kubenswrapper[4711]: I1203 12:46:37.441162 4711 generic.go:334] "Generic (PLEG): container finished" podID="b93a523c-70d8-4e6a-a2f8-84eb85ea64d6" containerID="8dfd6b14f1478b539768c44ce6e52d71d7a2141500806e78386bbbae36504b00" exitCode=0 Dec 03 12:46:37 crc kubenswrapper[4711]: I1203 12:46:37.441261 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-lw7wx" event={"ID":"b93a523c-70d8-4e6a-a2f8-84eb85ea64d6","Type":"ContainerDied","Data":"8dfd6b14f1478b539768c44ce6e52d71d7a2141500806e78386bbbae36504b00"} Dec 03 12:46:38 crc kubenswrapper[4711]: I1203 12:46:38.707650 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-lw7wx" Dec 03 12:46:38 crc kubenswrapper[4711]: I1203 12:46:38.784083 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hjh9\" (UniqueName: \"kubernetes.io/projected/b93a523c-70d8-4e6a-a2f8-84eb85ea64d6-kube-api-access-4hjh9\") pod \"b93a523c-70d8-4e6a-a2f8-84eb85ea64d6\" (UID: \"b93a523c-70d8-4e6a-a2f8-84eb85ea64d6\") " Dec 03 12:46:38 crc kubenswrapper[4711]: I1203 12:46:38.784185 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b93a523c-70d8-4e6a-a2f8-84eb85ea64d6-config-data\") pod \"b93a523c-70d8-4e6a-a2f8-84eb85ea64d6\" (UID: \"b93a523c-70d8-4e6a-a2f8-84eb85ea64d6\") " Dec 03 12:46:38 crc kubenswrapper[4711]: I1203 12:46:38.784261 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/b93a523c-70d8-4e6a-a2f8-84eb85ea64d6-db-sync-config-data\") pod \"b93a523c-70d8-4e6a-a2f8-84eb85ea64d6\" (UID: \"b93a523c-70d8-4e6a-a2f8-84eb85ea64d6\") " Dec 03 12:46:38 crc kubenswrapper[4711]: I1203 12:46:38.790191 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b93a523c-70d8-4e6a-a2f8-84eb85ea64d6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b93a523c-70d8-4e6a-a2f8-84eb85ea64d6" (UID: "b93a523c-70d8-4e6a-a2f8-84eb85ea64d6"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:46:38 crc kubenswrapper[4711]: I1203 12:46:38.791112 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b93a523c-70d8-4e6a-a2f8-84eb85ea64d6-kube-api-access-4hjh9" (OuterVolumeSpecName: "kube-api-access-4hjh9") pod "b93a523c-70d8-4e6a-a2f8-84eb85ea64d6" (UID: "b93a523c-70d8-4e6a-a2f8-84eb85ea64d6"). InnerVolumeSpecName "kube-api-access-4hjh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:46:38 crc kubenswrapper[4711]: I1203 12:46:38.818883 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b93a523c-70d8-4e6a-a2f8-84eb85ea64d6-config-data" (OuterVolumeSpecName: "config-data") pod "b93a523c-70d8-4e6a-a2f8-84eb85ea64d6" (UID: "b93a523c-70d8-4e6a-a2f8-84eb85ea64d6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:46:38 crc kubenswrapper[4711]: I1203 12:46:38.885430 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hjh9\" (UniqueName: \"kubernetes.io/projected/b93a523c-70d8-4e6a-a2f8-84eb85ea64d6-kube-api-access-4hjh9\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:38 crc kubenswrapper[4711]: I1203 12:46:38.885726 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b93a523c-70d8-4e6a-a2f8-84eb85ea64d6-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:38 crc kubenswrapper[4711]: I1203 12:46:38.885737 4711 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b93a523c-70d8-4e6a-a2f8-84eb85ea64d6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:39 crc kubenswrapper[4711]: I1203 12:46:39.454598 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-lw7wx" event={"ID":"b93a523c-70d8-4e6a-a2f8-84eb85ea64d6","Type":"ContainerDied","Data":"297a8551dfddab15329806556fbac92f648d91b3afe09f8e3dea8f17a2273486"} Dec 03 12:46:39 crc kubenswrapper[4711]: I1203 12:46:39.454636 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="297a8551dfddab15329806556fbac92f648d91b3afe09f8e3dea8f17a2273486" Dec 03 12:46:39 crc kubenswrapper[4711]: I1203 12:46:39.454678 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-lw7wx" Dec 03 12:46:40 crc kubenswrapper[4711]: I1203 12:46:40.932116 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Dec 03 12:46:40 crc kubenswrapper[4711]: E1203 12:46:40.932610 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b93a523c-70d8-4e6a-a2f8-84eb85ea64d6" containerName="glance-db-sync" Dec 03 12:46:40 crc kubenswrapper[4711]: I1203 12:46:40.932632 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="b93a523c-70d8-4e6a-a2f8-84eb85ea64d6" containerName="glance-db-sync" Dec 03 12:46:40 crc kubenswrapper[4711]: I1203 12:46:40.932797 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="b93a523c-70d8-4e6a-a2f8-84eb85ea64d6" containerName="glance-db-sync" Dec 03 12:46:40 crc kubenswrapper[4711]: I1203 12:46:40.933688 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:40 crc kubenswrapper[4711]: I1203 12:46:40.935511 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-nlh2p" Dec 03 12:46:40 crc kubenswrapper[4711]: I1203 12:46:40.936135 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Dec 03 12:46:40 crc kubenswrapper[4711]: I1203 12:46:40.938332 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Dec 03 12:46:40 crc kubenswrapper[4711]: I1203 12:46:40.950417 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.000159 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.001505 4711 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.011892 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.019815 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.020237 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.020352 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.020476 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-dev\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.020615 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-config-data\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.020733 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-logs\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.020823 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kdk8\" (UniqueName: \"kubernetes.io/projected/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-kube-api-access-6kdk8\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.020972 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-httpd-run\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.021097 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-sys\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.021239 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.021288 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-lib-modules\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.021304 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-run\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.021319 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-etc-nvme\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.021434 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-scripts\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.122296 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-lib-modules\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.122580 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-run\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.122686 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-etc-nvme\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.122802 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk52j\" (UniqueName: \"kubernetes.io/projected/470f7e34-0505-4450-9caa-1a9199ff2227-kube-api-access-hk52j\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.122425 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-lib-modules\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.122824 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-etc-nvme\") pod 
\"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.122607 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-run\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.123307 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.123364 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-scripts\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.123400 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/470f7e34-0505-4450-9caa-1a9199ff2227-scripts\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.123464 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-etc-iscsi\") pod \"glance-default-single-0\" (UID: 
\"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.123523 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.123573 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/470f7e34-0505-4450-9caa-1a9199ff2227-logs\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.123595 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/470f7e34-0505-4450-9caa-1a9199ff2227-config-data\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.123629 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.123646 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 
03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.123708 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-run\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.123723 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/470f7e34-0505-4450-9caa-1a9199ff2227-httpd-run\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.123750 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-dev\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.123790 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-config-data\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.123834 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-logs\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.123852 4711 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.123869 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kdk8\" (UniqueName: \"kubernetes.io/projected/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-kube-api-access-6kdk8\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.123900 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-lib-modules\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.123964 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-sys\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.123992 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-dev\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.124014 4711 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.124039 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-httpd-run\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.124065 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-etc-nvme\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.124086 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.124101 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-sys\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.124198 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-sys\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.124229 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-dev\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.124261 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.124490 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.124600 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") device mount path \"/mnt/openstack/pv15\"" pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.125084 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-logs\") pod 
\"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.125256 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.125370 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-httpd-run\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.129676 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-scripts\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.131521 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-config-data\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.141634 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kdk8\" (UniqueName: \"kubernetes.io/projected/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-kube-api-access-6kdk8\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 
03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.142672 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.147190 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-single-1\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.225373 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/470f7e34-0505-4450-9caa-1a9199ff2227-httpd-run\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.225421 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-run\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.225455 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.225476 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-lib-modules\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.225497 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-sys\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.225514 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-dev\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.225537 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.225560 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-etc-nvme\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.225585 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk52j\" (UniqueName: \"kubernetes.io/projected/470f7e34-0505-4450-9caa-1a9199ff2227-kube-api-access-hk52j\") pod \"glance-default-single-0\" (UID: 
\"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.225612 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.225630 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/470f7e34-0505-4450-9caa-1a9199ff2227-scripts\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.225651 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.225676 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/470f7e34-0505-4450-9caa-1a9199ff2227-logs\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.225693 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/470f7e34-0505-4450-9caa-1a9199ff2227-config-data\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc 
kubenswrapper[4711]: I1203 12:46:41.226260 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.226346 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-sys\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.226302 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") device mount path \"/mnt/openstack/pv07\"" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.226259 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.226371 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") device mount path \"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.226614 4711 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/470f7e34-0505-4450-9caa-1a9199ff2227-logs\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.226668 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-dev\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.226718 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-etc-nvme\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.226734 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/470f7e34-0505-4450-9caa-1a9199ff2227-httpd-run\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.226791 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-run\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.226326 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-lib-modules\") pod \"glance-default-single-0\" (UID: 
\"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.229822 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/470f7e34-0505-4450-9caa-1a9199ff2227-config-data\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.230062 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/470f7e34-0505-4450-9caa-1a9199ff2227-scripts\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.242633 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk52j\" (UniqueName: \"kubernetes.io/projected/470f7e34-0505-4450-9caa-1a9199ff2227-kube-api-access-hk52j\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.244612 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.246698 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-single-0\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.253876 
4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.316513 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.573663 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 03 12:46:41 crc kubenswrapper[4711]: I1203 12:46:41.687292 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Dec 03 12:46:41 crc kubenswrapper[4711]: W1203 12:46:41.694599 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fdf7796_6e30_4ac7_9bcd_6ca3a398d2d4.slice/crio-9fdf7af2325e174ef4e5fe913d19297e2213d677a702c122323bea1c6fa21097 WatchSource:0}: Error finding container 9fdf7af2325e174ef4e5fe913d19297e2213d677a702c122323bea1c6fa21097: Status 404 returned error can't find the container with id 9fdf7af2325e174ef4e5fe913d19297e2213d677a702c122323bea1c6fa21097 Dec 03 12:46:42 crc kubenswrapper[4711]: I1203 12:46:42.490822 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4","Type":"ContainerStarted","Data":"4c0349e5546f5e33df4202906637df250bdf6976507e8c69547c28a72a7d5aba"} Dec 03 12:46:42 crc kubenswrapper[4711]: I1203 12:46:42.491522 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4","Type":"ContainerStarted","Data":"d16d24019a8305850ba34856149f545d3b8b959abc504a81cce4458cc1264955"} Dec 03 12:46:42 crc kubenswrapper[4711]: I1203 12:46:42.491544 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" 
event={"ID":"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4","Type":"ContainerStarted","Data":"9fdf7af2325e174ef4e5fe913d19297e2213d677a702c122323bea1c6fa21097"} Dec 03 12:46:42 crc kubenswrapper[4711]: I1203 12:46:42.494156 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"470f7e34-0505-4450-9caa-1a9199ff2227","Type":"ContainerStarted","Data":"3cc6e924d7a676d03eba3ac7311cb6da0031f6b9116c8cc59fd8145f2612d299"} Dec 03 12:46:42 crc kubenswrapper[4711]: I1203 12:46:42.494234 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"470f7e34-0505-4450-9caa-1a9199ff2227","Type":"ContainerStarted","Data":"3f1a8d0248466108933d64d881cef8d44d5586d774770c2da4a63defabeb5f0c"} Dec 03 12:46:42 crc kubenswrapper[4711]: I1203 12:46:42.494262 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"470f7e34-0505-4450-9caa-1a9199ff2227","Type":"ContainerStarted","Data":"160df1e0710c0c5d924b5d02b0dba0b6bc849e579051fe533e8f5050289255a7"} Dec 03 12:46:42 crc kubenswrapper[4711]: I1203 12:46:42.569994 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=3.5698627050000002 podStartE2EDuration="3.569862705s" podCreationTimestamp="2025-12-03 12:46:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:46:42.555698495 +0000 UTC m=+1921.224949830" watchObservedRunningTime="2025-12-03 12:46:42.569862705 +0000 UTC m=+1921.239113980" Dec 03 12:46:42 crc kubenswrapper[4711]: I1203 12:46:42.573129 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-1" podStartSLOduration=2.573112414 podStartE2EDuration="2.573112414s" podCreationTimestamp="2025-12-03 12:46:40 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:46:42.518551223 +0000 UTC m=+1921.187802488" watchObservedRunningTime="2025-12-03 12:46:42.573112414 +0000 UTC m=+1921.242363699" Dec 03 12:46:51 crc kubenswrapper[4711]: I1203 12:46:51.254730 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:51 crc kubenswrapper[4711]: I1203 12:46:51.255383 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:51 crc kubenswrapper[4711]: I1203 12:46:51.283962 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:51 crc kubenswrapper[4711]: I1203 12:46:51.297845 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:51 crc kubenswrapper[4711]: I1203 12:46:51.317108 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:51 crc kubenswrapper[4711]: I1203 12:46:51.317522 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:51 crc kubenswrapper[4711]: I1203 12:46:51.359580 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:51 crc kubenswrapper[4711]: I1203 12:46:51.363411 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:51 crc kubenswrapper[4711]: I1203 12:46:51.560883 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:51 crc kubenswrapper[4711]: I1203 
12:46:51.561460 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:51 crc kubenswrapper[4711]: I1203 12:46:51.561476 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:51 crc kubenswrapper[4711]: I1203 12:46:51.561489 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:53 crc kubenswrapper[4711]: I1203 12:46:53.584998 4711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 12:46:53 crc kubenswrapper[4711]: I1203 12:46:53.585392 4711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 12:46:55 crc kubenswrapper[4711]: I1203 12:46:55.519561 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:55 crc kubenswrapper[4711]: I1203 12:46:55.520163 4711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 12:46:55 crc kubenswrapper[4711]: I1203 12:46:55.523127 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:46:55 crc kubenswrapper[4711]: I1203 12:46:55.531006 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:55 crc kubenswrapper[4711]: I1203 12:46:55.531108 4711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 12:46:55 crc kubenswrapper[4711]: I1203 12:46:55.580069 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:46:55 crc kubenswrapper[4711]: I1203 12:46:55.681196 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 03 12:46:56 crc 
kubenswrapper[4711]: I1203 12:46:56.611091 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="470f7e34-0505-4450-9caa-1a9199ff2227" containerName="glance-log" containerID="cri-o://3f1a8d0248466108933d64d881cef8d44d5586d774770c2da4a63defabeb5f0c" gracePeriod=30 Dec 03 12:46:56 crc kubenswrapper[4711]: I1203 12:46:56.611879 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="470f7e34-0505-4450-9caa-1a9199ff2227" containerName="glance-httpd" containerID="cri-o://3cc6e924d7a676d03eba3ac7311cb6da0031f6b9116c8cc59fd8145f2612d299" gracePeriod=30 Dec 03 12:46:56 crc kubenswrapper[4711]: I1203 12:46:56.616924 4711 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-single-0" podUID="470f7e34-0505-4450-9caa-1a9199ff2227" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.0.103:9292/healthcheck\": EOF" Dec 03 12:46:56 crc kubenswrapper[4711]: E1203 12:46:56.751070 4711 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod470f7e34_0505_4450_9caa_1a9199ff2227.slice/crio-conmon-3f1a8d0248466108933d64d881cef8d44d5586d774770c2da4a63defabeb5f0c.scope\": RecentStats: unable to find data in memory cache]" Dec 03 12:46:57 crc kubenswrapper[4711]: I1203 12:46:57.635158 4711 generic.go:334] "Generic (PLEG): container finished" podID="470f7e34-0505-4450-9caa-1a9199ff2227" containerID="3f1a8d0248466108933d64d881cef8d44d5586d774770c2da4a63defabeb5f0c" exitCode=143 Dec 03 12:46:57 crc kubenswrapper[4711]: I1203 12:46:57.635330 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" 
event={"ID":"470f7e34-0505-4450-9caa-1a9199ff2227","Type":"ContainerDied","Data":"3f1a8d0248466108933d64d881cef8d44d5586d774770c2da4a63defabeb5f0c"} Dec 03 12:47:01 crc kubenswrapper[4711]: I1203 12:47:01.664032 4711 generic.go:334] "Generic (PLEG): container finished" podID="470f7e34-0505-4450-9caa-1a9199ff2227" containerID="3cc6e924d7a676d03eba3ac7311cb6da0031f6b9116c8cc59fd8145f2612d299" exitCode=0 Dec 03 12:47:01 crc kubenswrapper[4711]: I1203 12:47:01.664278 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"470f7e34-0505-4450-9caa-1a9199ff2227","Type":"ContainerDied","Data":"3cc6e924d7a676d03eba3ac7311cb6da0031f6b9116c8cc59fd8145f2612d299"} Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.109530 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.149702 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-run\") pod \"470f7e34-0505-4450-9caa-1a9199ff2227\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.149793 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-var-locks-brick\") pod \"470f7e34-0505-4450-9caa-1a9199ff2227\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.149888 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-lib-modules\") pod \"470f7e34-0505-4450-9caa-1a9199ff2227\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 
12:47:02.149981 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hk52j\" (UniqueName: \"kubernetes.io/projected/470f7e34-0505-4450-9caa-1a9199ff2227-kube-api-access-hk52j\") pod \"470f7e34-0505-4450-9caa-1a9199ff2227\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.150034 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/470f7e34-0505-4450-9caa-1a9199ff2227-scripts\") pod \"470f7e34-0505-4450-9caa-1a9199ff2227\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.150125 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/470f7e34-0505-4450-9caa-1a9199ff2227-logs\") pod \"470f7e34-0505-4450-9caa-1a9199ff2227\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.150139 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-run" (OuterVolumeSpecName: "run") pod "470f7e34-0505-4450-9caa-1a9199ff2227" (UID: "470f7e34-0505-4450-9caa-1a9199ff2227"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.150166 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "470f7e34-0505-4450-9caa-1a9199ff2227" (UID: "470f7e34-0505-4450-9caa-1a9199ff2227"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.150606 4711 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.150632 4711 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.151224 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/470f7e34-0505-4450-9caa-1a9199ff2227-logs" (OuterVolumeSpecName: "logs") pod "470f7e34-0505-4450-9caa-1a9199ff2227" (UID: "470f7e34-0505-4450-9caa-1a9199ff2227"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.151283 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "470f7e34-0505-4450-9caa-1a9199ff2227" (UID: "470f7e34-0505-4450-9caa-1a9199ff2227"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.159950 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/470f7e34-0505-4450-9caa-1a9199ff2227-kube-api-access-hk52j" (OuterVolumeSpecName: "kube-api-access-hk52j") pod "470f7e34-0505-4450-9caa-1a9199ff2227" (UID: "470f7e34-0505-4450-9caa-1a9199ff2227"). InnerVolumeSpecName "kube-api-access-hk52j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.159989 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/470f7e34-0505-4450-9caa-1a9199ff2227-scripts" (OuterVolumeSpecName: "scripts") pod "470f7e34-0505-4450-9caa-1a9199ff2227" (UID: "470f7e34-0505-4450-9caa-1a9199ff2227"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.251710 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-etc-nvme\") pod \"470f7e34-0505-4450-9caa-1a9199ff2227\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.251779 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-dev\") pod \"470f7e34-0505-4450-9caa-1a9199ff2227\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.251811 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"470f7e34-0505-4450-9caa-1a9199ff2227\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.251861 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-etc-iscsi\") pod \"470f7e34-0505-4450-9caa-1a9199ff2227\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.251888 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-sys\") pod \"470f7e34-0505-4450-9caa-1a9199ff2227\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.251881 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-dev" (OuterVolumeSpecName: "dev") pod "470f7e34-0505-4450-9caa-1a9199ff2227" (UID: "470f7e34-0505-4450-9caa-1a9199ff2227"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.251959 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/470f7e34-0505-4450-9caa-1a9199ff2227-httpd-run\") pod \"470f7e34-0505-4450-9caa-1a9199ff2227\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.251964 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "470f7e34-0505-4450-9caa-1a9199ff2227" (UID: "470f7e34-0505-4450-9caa-1a9199ff2227"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.252005 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/470f7e34-0505-4450-9caa-1a9199ff2227-config-data\") pod \"470f7e34-0505-4450-9caa-1a9199ff2227\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.252058 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"470f7e34-0505-4450-9caa-1a9199ff2227\" (UID: \"470f7e34-0505-4450-9caa-1a9199ff2227\") " Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.251991 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-sys" (OuterVolumeSpecName: "sys") pod "470f7e34-0505-4450-9caa-1a9199ff2227" (UID: "470f7e34-0505-4450-9caa-1a9199ff2227"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.252547 4711 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.252574 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hk52j\" (UniqueName: \"kubernetes.io/projected/470f7e34-0505-4450-9caa-1a9199ff2227-kube-api-access-hk52j\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.252593 4711 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/470f7e34-0505-4450-9caa-1a9199ff2227-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.252611 4711 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/470f7e34-0505-4450-9caa-1a9199ff2227-logs\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.252625 4711 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-dev\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.252640 4711 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.252655 4711 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-sys\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.253004 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/470f7e34-0505-4450-9caa-1a9199ff2227-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "470f7e34-0505-4450-9caa-1a9199ff2227" (UID: "470f7e34-0505-4450-9caa-1a9199ff2227"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.253145 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "470f7e34-0505-4450-9caa-1a9199ff2227" (UID: "470f7e34-0505-4450-9caa-1a9199ff2227"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.255195 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance-cache") pod "470f7e34-0505-4450-9caa-1a9199ff2227" (UID: "470f7e34-0505-4450-9caa-1a9199ff2227"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.255653 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "470f7e34-0505-4450-9caa-1a9199ff2227" (UID: "470f7e34-0505-4450-9caa-1a9199ff2227"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.296803 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/470f7e34-0505-4450-9caa-1a9199ff2227-config-data" (OuterVolumeSpecName: "config-data") pod "470f7e34-0505-4450-9caa-1a9199ff2227" (UID: "470f7e34-0505-4450-9caa-1a9199ff2227"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.353506 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.353552 4711 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/470f7e34-0505-4450-9caa-1a9199ff2227-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.353565 4711 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/470f7e34-0505-4450-9caa-1a9199ff2227-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.353577 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/470f7e34-0505-4450-9caa-1a9199ff2227-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.353598 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.370086 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.374364 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.455366 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.455395 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.684669 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"470f7e34-0505-4450-9caa-1a9199ff2227","Type":"ContainerDied","Data":"160df1e0710c0c5d924b5d02b0dba0b6bc849e579051fe533e8f5050289255a7"} Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.684746 4711 scope.go:117] "RemoveContainer" containerID="3cc6e924d7a676d03eba3ac7311cb6da0031f6b9116c8cc59fd8145f2612d299" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.684944 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.746109 4711 scope.go:117] "RemoveContainer" containerID="3f1a8d0248466108933d64d881cef8d44d5586d774770c2da4a63defabeb5f0c" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.750779 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.813509 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.869366 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 03 12:47:02 crc kubenswrapper[4711]: E1203 12:47:02.869690 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="470f7e34-0505-4450-9caa-1a9199ff2227" containerName="glance-httpd" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.869709 4711 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="470f7e34-0505-4450-9caa-1a9199ff2227" containerName="glance-httpd" Dec 03 12:47:02 crc kubenswrapper[4711]: E1203 12:47:02.869739 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="470f7e34-0505-4450-9caa-1a9199ff2227" containerName="glance-log" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.869747 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="470f7e34-0505-4450-9caa-1a9199ff2227" containerName="glance-log" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.869915 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="470f7e34-0505-4450-9caa-1a9199ff2227" containerName="glance-log" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.869930 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="470f7e34-0505-4450-9caa-1a9199ff2227" containerName="glance-httpd" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.870673 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:02 crc kubenswrapper[4711]: I1203 12:47:02.878206 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.071680 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/541bebd5-0485-4874-98e4-39ebf4267705-config-data\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.071763 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " 
pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.071796 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-run\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.071819 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/541bebd5-0485-4874-98e4-39ebf4267705-httpd-run\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.071925 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-sys\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.071963 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/541bebd5-0485-4874-98e4-39ebf4267705-scripts\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.071988 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df4jv\" (UniqueName: \"kubernetes.io/projected/541bebd5-0485-4874-98e4-39ebf4267705-kube-api-access-df4jv\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " 
pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.072030 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-lib-modules\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.072218 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.072280 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/541bebd5-0485-4874-98e4-39ebf4267705-logs\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.072349 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-etc-nvme\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.072378 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" 
Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.072499 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.072618 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-dev\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.174719 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/541bebd5-0485-4874-98e4-39ebf4267705-config-data\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.174792 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.174835 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-run\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.174864 4711 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/541bebd5-0485-4874-98e4-39ebf4267705-httpd-run\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.174957 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.174969 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-run\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.175443 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/541bebd5-0485-4874-98e4-39ebf4267705-httpd-run\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.175622 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-sys\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.175656 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/541bebd5-0485-4874-98e4-39ebf4267705-scripts\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.175678 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df4jv\" (UniqueName: \"kubernetes.io/projected/541bebd5-0485-4874-98e4-39ebf4267705-kube-api-access-df4jv\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.175752 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-lib-modules\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.175788 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.175843 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/541bebd5-0485-4874-98e4-39ebf4267705-logs\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.175853 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-lib-modules\") pod \"glance-default-single-0\" (UID: 
\"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.175868 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-etc-nvme\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.175975 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.176007 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.176026 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-etc-nvme\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.176077 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-dev\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 
12:47:03.175659 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-sys\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.176120 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") device mount path \"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.176160 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.176168 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") device mount path \"/mnt/openstack/pv07\"" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.176190 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-dev\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.176489 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/541bebd5-0485-4874-98e4-39ebf4267705-logs\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.180743 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/541bebd5-0485-4874-98e4-39ebf4267705-scripts\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.199077 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/541bebd5-0485-4874-98e4-39ebf4267705-config-data\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.201710 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df4jv\" (UniqueName: \"kubernetes.io/projected/541bebd5-0485-4874-98e4-39ebf4267705-kube-api-access-df4jv\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.202090 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-single-0\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.206221 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-0\" (UID: 
\"541bebd5-0485-4874-98e4-39ebf4267705\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.498036 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.826239 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="470f7e34-0505-4450-9caa-1a9199ff2227" path="/var/lib/kubelet/pods/470f7e34-0505-4450-9caa-1a9199ff2227/volumes" Dec 03 12:47:03 crc kubenswrapper[4711]: I1203 12:47:03.919528 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 03 12:47:03 crc kubenswrapper[4711]: W1203 12:47:03.923993 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod541bebd5_0485_4874_98e4_39ebf4267705.slice/crio-e5fa304fa020bd9569fdf63137a693e80600fadc39c2bf0d9b02a4967b763f64 WatchSource:0}: Error finding container e5fa304fa020bd9569fdf63137a693e80600fadc39c2bf0d9b02a4967b763f64: Status 404 returned error can't find the container with id e5fa304fa020bd9569fdf63137a693e80600fadc39c2bf0d9b02a4967b763f64 Dec 03 12:47:04 crc kubenswrapper[4711]: I1203 12:47:04.706562 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"541bebd5-0485-4874-98e4-39ebf4267705","Type":"ContainerStarted","Data":"118950ab3b2968b82e66aca09de7dd3cf7dd3cc26100e70781f958d35bded683"} Dec 03 12:47:04 crc kubenswrapper[4711]: I1203 12:47:04.707088 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"541bebd5-0485-4874-98e4-39ebf4267705","Type":"ContainerStarted","Data":"e5fa304fa020bd9569fdf63137a693e80600fadc39c2bf0d9b02a4967b763f64"} Dec 03 12:47:06 crc kubenswrapper[4711]: I1203 12:47:06.720589 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"541bebd5-0485-4874-98e4-39ebf4267705","Type":"ContainerStarted","Data":"490058e68fbfe9457a54317fd97eaba6ca0b048b09da485f630e738158273717"} Dec 03 12:47:06 crc kubenswrapper[4711]: I1203 12:47:06.751564 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=4.751546678 podStartE2EDuration="4.751546678s" podCreationTimestamp="2025-12-03 12:47:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:47:06.74652497 +0000 UTC m=+1945.415776245" watchObservedRunningTime="2025-12-03 12:47:06.751546678 +0000 UTC m=+1945.420797933" Dec 03 12:47:13 crc kubenswrapper[4711]: I1203 12:47:13.499134 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:13 crc kubenswrapper[4711]: I1203 12:47:13.499741 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:13 crc kubenswrapper[4711]: I1203 12:47:13.529046 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:13 crc kubenswrapper[4711]: I1203 12:47:13.548675 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:13 crc kubenswrapper[4711]: I1203 12:47:13.775690 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:13 crc kubenswrapper[4711]: I1203 12:47:13.775757 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:15 crc kubenswrapper[4711]: I1203 12:47:15.872894 4711 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:15 crc kubenswrapper[4711]: I1203 12:47:15.873274 4711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 12:47:15 crc kubenswrapper[4711]: I1203 12:47:15.879730 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:27 crc kubenswrapper[4711]: I1203 12:47:27.647943 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-lw7wx"] Dec 03 12:47:27 crc kubenswrapper[4711]: I1203 12:47:27.656814 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-lw7wx"] Dec 03 12:47:27 crc kubenswrapper[4711]: I1203 12:47:27.755092 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-2gmg2"] Dec 03 12:47:27 crc kubenswrapper[4711]: I1203 12:47:27.756091 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-2gmg2" Dec 03 12:47:27 crc kubenswrapper[4711]: I1203 12:47:27.758121 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"combined-ca-bundle" Dec 03 12:47:27 crc kubenswrapper[4711]: I1203 12:47:27.758723 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Dec 03 12:47:27 crc kubenswrapper[4711]: I1203 12:47:27.776079 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-2gmg2"] Dec 03 12:47:27 crc kubenswrapper[4711]: I1203 12:47:27.837436 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b93a523c-70d8-4e6a-a2f8-84eb85ea64d6" path="/var/lib/kubelet/pods/b93a523c-70d8-4e6a-a2f8-84eb85ea64d6/volumes" Dec 03 12:47:27 crc kubenswrapper[4711]: I1203 12:47:27.874684 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11699312-ea2a-42c3-8614-1593b6275d91-config-data\") pod \"glance-db-sync-2gmg2\" (UID: \"11699312-ea2a-42c3-8614-1593b6275d91\") " pod="glance-kuttl-tests/glance-db-sync-2gmg2" Dec 03 12:47:27 crc kubenswrapper[4711]: I1203 12:47:27.874733 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11699312-ea2a-42c3-8614-1593b6275d91-combined-ca-bundle\") pod \"glance-db-sync-2gmg2\" (UID: \"11699312-ea2a-42c3-8614-1593b6275d91\") " pod="glance-kuttl-tests/glance-db-sync-2gmg2" Dec 03 12:47:27 crc kubenswrapper[4711]: I1203 12:47:27.874762 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f9rt\" (UniqueName: \"kubernetes.io/projected/11699312-ea2a-42c3-8614-1593b6275d91-kube-api-access-8f9rt\") pod \"glance-db-sync-2gmg2\" (UID: \"11699312-ea2a-42c3-8614-1593b6275d91\") " 
pod="glance-kuttl-tests/glance-db-sync-2gmg2" Dec 03 12:47:27 crc kubenswrapper[4711]: I1203 12:47:27.875656 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/11699312-ea2a-42c3-8614-1593b6275d91-db-sync-config-data\") pod \"glance-db-sync-2gmg2\" (UID: \"11699312-ea2a-42c3-8614-1593b6275d91\") " pod="glance-kuttl-tests/glance-db-sync-2gmg2" Dec 03 12:47:27 crc kubenswrapper[4711]: I1203 12:47:27.977083 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f9rt\" (UniqueName: \"kubernetes.io/projected/11699312-ea2a-42c3-8614-1593b6275d91-kube-api-access-8f9rt\") pod \"glance-db-sync-2gmg2\" (UID: \"11699312-ea2a-42c3-8614-1593b6275d91\") " pod="glance-kuttl-tests/glance-db-sync-2gmg2" Dec 03 12:47:27 crc kubenswrapper[4711]: I1203 12:47:27.977268 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/11699312-ea2a-42c3-8614-1593b6275d91-db-sync-config-data\") pod \"glance-db-sync-2gmg2\" (UID: \"11699312-ea2a-42c3-8614-1593b6275d91\") " pod="glance-kuttl-tests/glance-db-sync-2gmg2" Dec 03 12:47:27 crc kubenswrapper[4711]: I1203 12:47:27.977326 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11699312-ea2a-42c3-8614-1593b6275d91-config-data\") pod \"glance-db-sync-2gmg2\" (UID: \"11699312-ea2a-42c3-8614-1593b6275d91\") " pod="glance-kuttl-tests/glance-db-sync-2gmg2" Dec 03 12:47:27 crc kubenswrapper[4711]: I1203 12:47:27.977358 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11699312-ea2a-42c3-8614-1593b6275d91-combined-ca-bundle\") pod \"glance-db-sync-2gmg2\" (UID: \"11699312-ea2a-42c3-8614-1593b6275d91\") " pod="glance-kuttl-tests/glance-db-sync-2gmg2" 
Dec 03 12:47:27 crc kubenswrapper[4711]: I1203 12:47:27.982625 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11699312-ea2a-42c3-8614-1593b6275d91-combined-ca-bundle\") pod \"glance-db-sync-2gmg2\" (UID: \"11699312-ea2a-42c3-8614-1593b6275d91\") " pod="glance-kuttl-tests/glance-db-sync-2gmg2" Dec 03 12:47:27 crc kubenswrapper[4711]: I1203 12:47:27.983032 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/11699312-ea2a-42c3-8614-1593b6275d91-db-sync-config-data\") pod \"glance-db-sync-2gmg2\" (UID: \"11699312-ea2a-42c3-8614-1593b6275d91\") " pod="glance-kuttl-tests/glance-db-sync-2gmg2" Dec 03 12:47:27 crc kubenswrapper[4711]: I1203 12:47:27.984497 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11699312-ea2a-42c3-8614-1593b6275d91-config-data\") pod \"glance-db-sync-2gmg2\" (UID: \"11699312-ea2a-42c3-8614-1593b6275d91\") " pod="glance-kuttl-tests/glance-db-sync-2gmg2" Dec 03 12:47:27 crc kubenswrapper[4711]: I1203 12:47:27.992732 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f9rt\" (UniqueName: \"kubernetes.io/projected/11699312-ea2a-42c3-8614-1593b6275d91-kube-api-access-8f9rt\") pod \"glance-db-sync-2gmg2\" (UID: \"11699312-ea2a-42c3-8614-1593b6275d91\") " pod="glance-kuttl-tests/glance-db-sync-2gmg2" Dec 03 12:47:28 crc kubenswrapper[4711]: I1203 12:47:28.137463 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-2gmg2" Dec 03 12:47:28 crc kubenswrapper[4711]: I1203 12:47:28.534840 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-2gmg2"] Dec 03 12:47:28 crc kubenswrapper[4711]: W1203 12:47:28.543729 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11699312_ea2a_42c3_8614_1593b6275d91.slice/crio-fb22a2d1bd0f8f5d12298492e7d23e38ba4b38101d72a4dee76e1b07e8726183 WatchSource:0}: Error finding container fb22a2d1bd0f8f5d12298492e7d23e38ba4b38101d72a4dee76e1b07e8726183: Status 404 returned error can't find the container with id fb22a2d1bd0f8f5d12298492e7d23e38ba4b38101d72a4dee76e1b07e8726183 Dec 03 12:47:28 crc kubenswrapper[4711]: I1203 12:47:28.909251 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-2gmg2" event={"ID":"11699312-ea2a-42c3-8614-1593b6275d91","Type":"ContainerStarted","Data":"fb22a2d1bd0f8f5d12298492e7d23e38ba4b38101d72a4dee76e1b07e8726183"} Dec 03 12:47:29 crc kubenswrapper[4711]: I1203 12:47:29.922469 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-2gmg2" event={"ID":"11699312-ea2a-42c3-8614-1593b6275d91","Type":"ContainerStarted","Data":"0aa6cfcc7cb1bcc1eddab83e201e348e86f7e7300c7dda744d07e6a23a153a2e"} Dec 03 12:47:29 crc kubenswrapper[4711]: I1203 12:47:29.945984 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-2gmg2" podStartSLOduration=2.945961357 podStartE2EDuration="2.945961357s" podCreationTimestamp="2025-12-03 12:47:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:47:29.942153702 +0000 UTC m=+1968.611404987" watchObservedRunningTime="2025-12-03 12:47:29.945961357 +0000 UTC m=+1968.615212622" Dec 03 12:47:32 crc 
kubenswrapper[4711]: I1203 12:47:32.961271 4711 generic.go:334] "Generic (PLEG): container finished" podID="11699312-ea2a-42c3-8614-1593b6275d91" containerID="0aa6cfcc7cb1bcc1eddab83e201e348e86f7e7300c7dda744d07e6a23a153a2e" exitCode=0 Dec 03 12:47:32 crc kubenswrapper[4711]: I1203 12:47:32.961381 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-2gmg2" event={"ID":"11699312-ea2a-42c3-8614-1593b6275d91","Type":"ContainerDied","Data":"0aa6cfcc7cb1bcc1eddab83e201e348e86f7e7300c7dda744d07e6a23a153a2e"} Dec 03 12:47:34 crc kubenswrapper[4711]: I1203 12:47:34.249581 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-2gmg2" Dec 03 12:47:34 crc kubenswrapper[4711]: I1203 12:47:34.381977 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f9rt\" (UniqueName: \"kubernetes.io/projected/11699312-ea2a-42c3-8614-1593b6275d91-kube-api-access-8f9rt\") pod \"11699312-ea2a-42c3-8614-1593b6275d91\" (UID: \"11699312-ea2a-42c3-8614-1593b6275d91\") " Dec 03 12:47:34 crc kubenswrapper[4711]: I1203 12:47:34.382021 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11699312-ea2a-42c3-8614-1593b6275d91-combined-ca-bundle\") pod \"11699312-ea2a-42c3-8614-1593b6275d91\" (UID: \"11699312-ea2a-42c3-8614-1593b6275d91\") " Dec 03 12:47:34 crc kubenswrapper[4711]: I1203 12:47:34.382145 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/11699312-ea2a-42c3-8614-1593b6275d91-db-sync-config-data\") pod \"11699312-ea2a-42c3-8614-1593b6275d91\" (UID: \"11699312-ea2a-42c3-8614-1593b6275d91\") " Dec 03 12:47:34 crc kubenswrapper[4711]: I1203 12:47:34.382171 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/11699312-ea2a-42c3-8614-1593b6275d91-config-data\") pod \"11699312-ea2a-42c3-8614-1593b6275d91\" (UID: \"11699312-ea2a-42c3-8614-1593b6275d91\") " Dec 03 12:47:34 crc kubenswrapper[4711]: I1203 12:47:34.389815 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11699312-ea2a-42c3-8614-1593b6275d91-kube-api-access-8f9rt" (OuterVolumeSpecName: "kube-api-access-8f9rt") pod "11699312-ea2a-42c3-8614-1593b6275d91" (UID: "11699312-ea2a-42c3-8614-1593b6275d91"). InnerVolumeSpecName "kube-api-access-8f9rt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:47:34 crc kubenswrapper[4711]: I1203 12:47:34.389929 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11699312-ea2a-42c3-8614-1593b6275d91-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "11699312-ea2a-42c3-8614-1593b6275d91" (UID: "11699312-ea2a-42c3-8614-1593b6275d91"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:34 crc kubenswrapper[4711]: I1203 12:47:34.403451 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11699312-ea2a-42c3-8614-1593b6275d91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11699312-ea2a-42c3-8614-1593b6275d91" (UID: "11699312-ea2a-42c3-8614-1593b6275d91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:34 crc kubenswrapper[4711]: I1203 12:47:34.424388 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11699312-ea2a-42c3-8614-1593b6275d91-config-data" (OuterVolumeSpecName: "config-data") pod "11699312-ea2a-42c3-8614-1593b6275d91" (UID: "11699312-ea2a-42c3-8614-1593b6275d91"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:34 crc kubenswrapper[4711]: I1203 12:47:34.484421 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8f9rt\" (UniqueName: \"kubernetes.io/projected/11699312-ea2a-42c3-8614-1593b6275d91-kube-api-access-8f9rt\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:34 crc kubenswrapper[4711]: I1203 12:47:34.484473 4711 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11699312-ea2a-42c3-8614-1593b6275d91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:34 crc kubenswrapper[4711]: I1203 12:47:34.484485 4711 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/11699312-ea2a-42c3-8614-1593b6275d91-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:34 crc kubenswrapper[4711]: I1203 12:47:34.484497 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11699312-ea2a-42c3-8614-1593b6275d91-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:34 crc kubenswrapper[4711]: I1203 12:47:34.983794 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-2gmg2" event={"ID":"11699312-ea2a-42c3-8614-1593b6275d91","Type":"ContainerDied","Data":"fb22a2d1bd0f8f5d12298492e7d23e38ba4b38101d72a4dee76e1b07e8726183"} Dec 03 12:47:34 crc kubenswrapper[4711]: I1203 12:47:34.984125 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb22a2d1bd0f8f5d12298492e7d23e38ba4b38101d72a4dee76e1b07e8726183" Dec 03 12:47:34 crc kubenswrapper[4711]: I1203 12:47:34.983871 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-2gmg2" Dec 03 12:47:35 crc kubenswrapper[4711]: I1203 12:47:35.151511 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 03 12:47:35 crc kubenswrapper[4711]: I1203 12:47:35.152089 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="541bebd5-0485-4874-98e4-39ebf4267705" containerName="glance-log" containerID="cri-o://118950ab3b2968b82e66aca09de7dd3cf7dd3cc26100e70781f958d35bded683" gracePeriod=30 Dec 03 12:47:35 crc kubenswrapper[4711]: I1203 12:47:35.152219 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="541bebd5-0485-4874-98e4-39ebf4267705" containerName="glance-httpd" containerID="cri-o://490058e68fbfe9457a54317fd97eaba6ca0b048b09da485f630e738158273717" gracePeriod=30 Dec 03 12:47:35 crc kubenswrapper[4711]: I1203 12:47:35.161525 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Dec 03 12:47:35 crc kubenswrapper[4711]: I1203 12:47:35.161951 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4" containerName="glance-log" containerID="cri-o://d16d24019a8305850ba34856149f545d3b8b959abc504a81cce4458cc1264955" gracePeriod=30 Dec 03 12:47:35 crc kubenswrapper[4711]: I1203 12:47:35.162090 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4" containerName="glance-httpd" containerID="cri-o://4c0349e5546f5e33df4202906637df250bdf6976507e8c69547c28a72a7d5aba" gracePeriod=30 Dec 03 12:47:35 crc kubenswrapper[4711]: I1203 12:47:35.992234 4711 generic.go:334] "Generic (PLEG): container finished" 
podID="541bebd5-0485-4874-98e4-39ebf4267705" containerID="118950ab3b2968b82e66aca09de7dd3cf7dd3cc26100e70781f958d35bded683" exitCode=143 Dec 03 12:47:35 crc kubenswrapper[4711]: I1203 12:47:35.992326 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"541bebd5-0485-4874-98e4-39ebf4267705","Type":"ContainerDied","Data":"118950ab3b2968b82e66aca09de7dd3cf7dd3cc26100e70781f958d35bded683"} Dec 03 12:47:35 crc kubenswrapper[4711]: I1203 12:47:35.996052 4711 generic.go:334] "Generic (PLEG): container finished" podID="8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4" containerID="d16d24019a8305850ba34856149f545d3b8b959abc504a81cce4458cc1264955" exitCode=143 Dec 03 12:47:35 crc kubenswrapper[4711]: I1203 12:47:35.996087 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4","Type":"ContainerDied","Data":"d16d24019a8305850ba34856149f545d3b8b959abc504a81cce4458cc1264955"} Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.712737 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.714456 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.777486 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-var-locks-brick\") pod \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.777534 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-run\") pod \"541bebd5-0485-4874-98e4-39ebf4267705\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.777563 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-sys\") pod \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.777591 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"541bebd5-0485-4874-98e4-39ebf4267705\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.777620 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/541bebd5-0485-4874-98e4-39ebf4267705-config-data\") pod \"541bebd5-0485-4874-98e4-39ebf4267705\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.777641 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/541bebd5-0485-4874-98e4-39ebf4267705-scripts\") pod \"541bebd5-0485-4874-98e4-39ebf4267705\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.777673 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-dev\") pod \"541bebd5-0485-4874-98e4-39ebf4267705\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.777677 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-run" (OuterVolumeSpecName: "run") pod "541bebd5-0485-4874-98e4-39ebf4267705" (UID: "541bebd5-0485-4874-98e4-39ebf4267705"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.777709 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-httpd-run\") pod \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.777735 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-sys" (OuterVolumeSpecName: "sys") pod "8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4" (UID: "8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.777741 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"541bebd5-0485-4874-98e4-39ebf4267705\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.777765 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-lib-modules\") pod \"541bebd5-0485-4874-98e4-39ebf4267705\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.777810 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-run\") pod \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.777823 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-dev" (OuterVolumeSpecName: "dev") pod "541bebd5-0485-4874-98e4-39ebf4267705" (UID: "541bebd5-0485-4874-98e4-39ebf4267705"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.777856 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-scripts\") pod \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.777892 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "541bebd5-0485-4874-98e4-39ebf4267705" (UID: "541bebd5-0485-4874-98e4-39ebf4267705"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.777941 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-run" (OuterVolumeSpecName: "run") pod "8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4" (UID: "8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.777901 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-var-locks-brick\") pod \"541bebd5-0485-4874-98e4-39ebf4267705\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.777983 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "541bebd5-0485-4874-98e4-39ebf4267705" (UID: "541bebd5-0485-4874-98e4-39ebf4267705"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.777992 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-config-data\") pod \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.778024 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.778090 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/541bebd5-0485-4874-98e4-39ebf4267705-httpd-run\") pod \"541bebd5-0485-4874-98e4-39ebf4267705\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.778126 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-etc-iscsi\") pod \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.778177 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-etc-iscsi\") pod \"541bebd5-0485-4874-98e4-39ebf4267705\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.778249 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-etc-iscsi" (OuterVolumeSpecName: 
"etc-iscsi") pod "8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4" (UID: "8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.778295 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-sys\") pod \"541bebd5-0485-4874-98e4-39ebf4267705\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.778357 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "541bebd5-0485-4874-98e4-39ebf4267705" (UID: "541bebd5-0485-4874-98e4-39ebf4267705"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.778385 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-sys" (OuterVolumeSpecName: "sys") pod "541bebd5-0485-4874-98e4-39ebf4267705" (UID: "541bebd5-0485-4874-98e4-39ebf4267705"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.778322 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kdk8\" (UniqueName: \"kubernetes.io/projected/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-kube-api-access-6kdk8\") pod \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.778416 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/541bebd5-0485-4874-98e4-39ebf4267705-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "541bebd5-0485-4874-98e4-39ebf4267705" (UID: "541bebd5-0485-4874-98e4-39ebf4267705"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.778481 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4" (UID: "8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.778483 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4" (UID: "8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.778578 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-dev" (OuterVolumeSpecName: "dev") pod "8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4" (UID: "8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.778445 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-dev\") pod \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.778800 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-lib-modules\") pod \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.778828 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-df4jv\" (UniqueName: \"kubernetes.io/projected/541bebd5-0485-4874-98e4-39ebf4267705-kube-api-access-df4jv\") pod \"541bebd5-0485-4874-98e4-39ebf4267705\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.778853 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/541bebd5-0485-4874-98e4-39ebf4267705-logs\") pod \"541bebd5-0485-4874-98e4-39ebf4267705\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.778875 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-etc-nvme\") pod \"541bebd5-0485-4874-98e4-39ebf4267705\" (UID: \"541bebd5-0485-4874-98e4-39ebf4267705\") " Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.778934 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.778975 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-logs\") pod \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.779004 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-etc-nvme\") pod \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\" (UID: \"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4\") " Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.779352 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/541bebd5-0485-4874-98e4-39ebf4267705-logs" (OuterVolumeSpecName: "logs") pod "541bebd5-0485-4874-98e4-39ebf4267705" (UID: "541bebd5-0485-4874-98e4-39ebf4267705"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.779388 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4" (UID: "8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.779527 4711 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.779619 4711 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.779630 4711 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.779641 4711 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/541bebd5-0485-4874-98e4-39ebf4267705-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.779649 4711 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.779656 4711 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.779664 4711 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-sys\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.779672 4711 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-dev\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.779679 4711 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.779687 4711 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/541bebd5-0485-4874-98e4-39ebf4267705-logs\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.779694 4711 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.779703 4711 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.779716 4711 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-sys\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.779730 4711 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-dev\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.779741 4711 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.779782 4711 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-logs" (OuterVolumeSpecName: "logs") pod "8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4" (UID: "8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.779817 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4" (UID: "8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.779847 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "541bebd5-0485-4874-98e4-39ebf4267705" (UID: "541bebd5-0485-4874-98e4-39ebf4267705"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.783584 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance-cache") pod "541bebd5-0485-4874-98e4-39ebf4267705" (UID: "541bebd5-0485-4874-98e4-39ebf4267705"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.783597 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-kube-api-access-6kdk8" (OuterVolumeSpecName: "kube-api-access-6kdk8") pod "8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4" (UID: "8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4"). InnerVolumeSpecName "kube-api-access-6kdk8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.783722 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-scripts" (OuterVolumeSpecName: "scripts") pod "8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4" (UID: "8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.783807 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/541bebd5-0485-4874-98e4-39ebf4267705-scripts" (OuterVolumeSpecName: "scripts") pod "541bebd5-0485-4874-98e4-39ebf4267705" (UID: "541bebd5-0485-4874-98e4-39ebf4267705"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.784327 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage15-crc" (OuterVolumeSpecName: "glance-cache") pod "8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4" (UID: "8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4"). InnerVolumeSpecName "local-storage15-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.784671 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "541bebd5-0485-4874-98e4-39ebf4267705" (UID: "541bebd5-0485-4874-98e4-39ebf4267705"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.785114 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/541bebd5-0485-4874-98e4-39ebf4267705-kube-api-access-df4jv" (OuterVolumeSpecName: "kube-api-access-df4jv") pod "541bebd5-0485-4874-98e4-39ebf4267705" (UID: "541bebd5-0485-4874-98e4-39ebf4267705"). InnerVolumeSpecName "kube-api-access-df4jv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.785348 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4" (UID: "8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.821134 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-config-data" (OuterVolumeSpecName: "config-data") pod "8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4" (UID: "8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.822013 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/541bebd5-0485-4874-98e4-39ebf4267705-config-data" (OuterVolumeSpecName: "config-data") pod "541bebd5-0485-4874-98e4-39ebf4267705" (UID: "541bebd5-0485-4874-98e4-39ebf4267705"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.881126 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.881180 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" " Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.881193 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kdk8\" (UniqueName: \"kubernetes.io/projected/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-kube-api-access-6kdk8\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.881206 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-df4jv\" (UniqueName: \"kubernetes.io/projected/541bebd5-0485-4874-98e4-39ebf4267705-kube-api-access-df4jv\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.881216 4711 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/541bebd5-0485-4874-98e4-39ebf4267705-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.881228 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.881237 4711 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-logs\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.881245 4711 reconciler_common.go:293] "Volume detached 
for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.881257 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.881266 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/541bebd5-0485-4874-98e4-39ebf4267705-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.881273 4711 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/541bebd5-0485-4874-98e4-39ebf4267705-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.881286 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.881294 4711 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.893856 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.894460 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.895572 4711 operation_generator.go:917] UnmountDevice succeeded for 
volume "local-storage15-crc" (UniqueName: "kubernetes.io/local-volume/local-storage15-crc") on node "crc" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.897220 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.982207 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.982247 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.982259 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:38 crc kubenswrapper[4711]: I1203 12:47:38.982270 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:39 crc kubenswrapper[4711]: I1203 12:47:39.026608 4711 generic.go:334] "Generic (PLEG): container finished" podID="541bebd5-0485-4874-98e4-39ebf4267705" containerID="490058e68fbfe9457a54317fd97eaba6ca0b048b09da485f630e738158273717" exitCode=0 Dec 03 12:47:39 crc kubenswrapper[4711]: I1203 12:47:39.026668 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:39 crc kubenswrapper[4711]: I1203 12:47:39.026691 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"541bebd5-0485-4874-98e4-39ebf4267705","Type":"ContainerDied","Data":"490058e68fbfe9457a54317fd97eaba6ca0b048b09da485f630e738158273717"} Dec 03 12:47:39 crc kubenswrapper[4711]: I1203 12:47:39.026721 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"541bebd5-0485-4874-98e4-39ebf4267705","Type":"ContainerDied","Data":"e5fa304fa020bd9569fdf63137a693e80600fadc39c2bf0d9b02a4967b763f64"} Dec 03 12:47:39 crc kubenswrapper[4711]: I1203 12:47:39.026745 4711 scope.go:117] "RemoveContainer" containerID="490058e68fbfe9457a54317fd97eaba6ca0b048b09da485f630e738158273717" Dec 03 12:47:39 crc kubenswrapper[4711]: I1203 12:47:39.029988 4711 generic.go:334] "Generic (PLEG): container finished" podID="8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4" containerID="4c0349e5546f5e33df4202906637df250bdf6976507e8c69547c28a72a7d5aba" exitCode=0 Dec 03 12:47:39 crc kubenswrapper[4711]: I1203 12:47:39.030084 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:47:39 crc kubenswrapper[4711]: I1203 12:47:39.030110 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4","Type":"ContainerDied","Data":"4c0349e5546f5e33df4202906637df250bdf6976507e8c69547c28a72a7d5aba"} Dec 03 12:47:39 crc kubenswrapper[4711]: I1203 12:47:39.031124 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4","Type":"ContainerDied","Data":"9fdf7af2325e174ef4e5fe913d19297e2213d677a702c122323bea1c6fa21097"} Dec 03 12:47:39 crc kubenswrapper[4711]: I1203 12:47:39.067069 4711 scope.go:117] "RemoveContainer" containerID="118950ab3b2968b82e66aca09de7dd3cf7dd3cc26100e70781f958d35bded683" Dec 03 12:47:39 crc kubenswrapper[4711]: I1203 12:47:39.081775 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 03 12:47:39 crc kubenswrapper[4711]: I1203 12:47:39.089476 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 03 12:47:39 crc kubenswrapper[4711]: I1203 12:47:39.094889 4711 scope.go:117] "RemoveContainer" containerID="490058e68fbfe9457a54317fd97eaba6ca0b048b09da485f630e738158273717" Dec 03 12:47:39 crc kubenswrapper[4711]: E1203 12:47:39.095508 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"490058e68fbfe9457a54317fd97eaba6ca0b048b09da485f630e738158273717\": container with ID starting with 490058e68fbfe9457a54317fd97eaba6ca0b048b09da485f630e738158273717 not found: ID does not exist" containerID="490058e68fbfe9457a54317fd97eaba6ca0b048b09da485f630e738158273717" Dec 03 12:47:39 crc kubenswrapper[4711]: I1203 12:47:39.095543 4711 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"490058e68fbfe9457a54317fd97eaba6ca0b048b09da485f630e738158273717"} err="failed to get container status \"490058e68fbfe9457a54317fd97eaba6ca0b048b09da485f630e738158273717\": rpc error: code = NotFound desc = could not find container \"490058e68fbfe9457a54317fd97eaba6ca0b048b09da485f630e738158273717\": container with ID starting with 490058e68fbfe9457a54317fd97eaba6ca0b048b09da485f630e738158273717 not found: ID does not exist" Dec 03 12:47:39 crc kubenswrapper[4711]: I1203 12:47:39.095570 4711 scope.go:117] "RemoveContainer" containerID="118950ab3b2968b82e66aca09de7dd3cf7dd3cc26100e70781f958d35bded683" Dec 03 12:47:39 crc kubenswrapper[4711]: I1203 12:47:39.095866 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Dec 03 12:47:39 crc kubenswrapper[4711]: E1203 12:47:39.096105 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"118950ab3b2968b82e66aca09de7dd3cf7dd3cc26100e70781f958d35bded683\": container with ID starting with 118950ab3b2968b82e66aca09de7dd3cf7dd3cc26100e70781f958d35bded683 not found: ID does not exist" containerID="118950ab3b2968b82e66aca09de7dd3cf7dd3cc26100e70781f958d35bded683" Dec 03 12:47:39 crc kubenswrapper[4711]: I1203 12:47:39.096134 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"118950ab3b2968b82e66aca09de7dd3cf7dd3cc26100e70781f958d35bded683"} err="failed to get container status \"118950ab3b2968b82e66aca09de7dd3cf7dd3cc26100e70781f958d35bded683\": rpc error: code = NotFound desc = could not find container \"118950ab3b2968b82e66aca09de7dd3cf7dd3cc26100e70781f958d35bded683\": container with ID starting with 118950ab3b2968b82e66aca09de7dd3cf7dd3cc26100e70781f958d35bded683 not found: ID does not exist" Dec 03 12:47:39 crc kubenswrapper[4711]: I1203 12:47:39.096154 4711 scope.go:117] "RemoveContainer" 
containerID="4c0349e5546f5e33df4202906637df250bdf6976507e8c69547c28a72a7d5aba" Dec 03 12:47:39 crc kubenswrapper[4711]: I1203 12:47:39.102670 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Dec 03 12:47:39 crc kubenswrapper[4711]: I1203 12:47:39.116956 4711 scope.go:117] "RemoveContainer" containerID="d16d24019a8305850ba34856149f545d3b8b959abc504a81cce4458cc1264955" Dec 03 12:47:39 crc kubenswrapper[4711]: I1203 12:47:39.131828 4711 scope.go:117] "RemoveContainer" containerID="4c0349e5546f5e33df4202906637df250bdf6976507e8c69547c28a72a7d5aba" Dec 03 12:47:39 crc kubenswrapper[4711]: E1203 12:47:39.132365 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c0349e5546f5e33df4202906637df250bdf6976507e8c69547c28a72a7d5aba\": container with ID starting with 4c0349e5546f5e33df4202906637df250bdf6976507e8c69547c28a72a7d5aba not found: ID does not exist" containerID="4c0349e5546f5e33df4202906637df250bdf6976507e8c69547c28a72a7d5aba" Dec 03 12:47:39 crc kubenswrapper[4711]: I1203 12:47:39.132407 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c0349e5546f5e33df4202906637df250bdf6976507e8c69547c28a72a7d5aba"} err="failed to get container status \"4c0349e5546f5e33df4202906637df250bdf6976507e8c69547c28a72a7d5aba\": rpc error: code = NotFound desc = could not find container \"4c0349e5546f5e33df4202906637df250bdf6976507e8c69547c28a72a7d5aba\": container with ID starting with 4c0349e5546f5e33df4202906637df250bdf6976507e8c69547c28a72a7d5aba not found: ID does not exist" Dec 03 12:47:39 crc kubenswrapper[4711]: I1203 12:47:39.132434 4711 scope.go:117] "RemoveContainer" containerID="d16d24019a8305850ba34856149f545d3b8b959abc504a81cce4458cc1264955" Dec 03 12:47:39 crc kubenswrapper[4711]: E1203 12:47:39.132785 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"d16d24019a8305850ba34856149f545d3b8b959abc504a81cce4458cc1264955\": container with ID starting with d16d24019a8305850ba34856149f545d3b8b959abc504a81cce4458cc1264955 not found: ID does not exist" containerID="d16d24019a8305850ba34856149f545d3b8b959abc504a81cce4458cc1264955" Dec 03 12:47:39 crc kubenswrapper[4711]: I1203 12:47:39.132823 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d16d24019a8305850ba34856149f545d3b8b959abc504a81cce4458cc1264955"} err="failed to get container status \"d16d24019a8305850ba34856149f545d3b8b959abc504a81cce4458cc1264955\": rpc error: code = NotFound desc = could not find container \"d16d24019a8305850ba34856149f545d3b8b959abc504a81cce4458cc1264955\": container with ID starting with d16d24019a8305850ba34856149f545d3b8b959abc504a81cce4458cc1264955 not found: ID does not exist" Dec 03 12:47:39 crc kubenswrapper[4711]: I1203 12:47:39.828532 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="541bebd5-0485-4874-98e4-39ebf4267705" path="/var/lib/kubelet/pods/541bebd5-0485-4874-98e4-39ebf4267705/volumes" Dec 03 12:47:39 crc kubenswrapper[4711]: I1203 12:47:39.831354 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4" path="/var/lib/kubelet/pods/8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4/volumes" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.502191 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 03 12:47:40 crc kubenswrapper[4711]: E1203 12:47:40.502600 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="541bebd5-0485-4874-98e4-39ebf4267705" containerName="glance-httpd" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.502622 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="541bebd5-0485-4874-98e4-39ebf4267705" containerName="glance-httpd" Dec 03 12:47:40 crc kubenswrapper[4711]: 
E1203 12:47:40.502644 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4" containerName="glance-log" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.502655 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4" containerName="glance-log" Dec 03 12:47:40 crc kubenswrapper[4711]: E1203 12:47:40.502675 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4" containerName="glance-httpd" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.502685 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4" containerName="glance-httpd" Dec 03 12:47:40 crc kubenswrapper[4711]: E1203 12:47:40.502705 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="541bebd5-0485-4874-98e4-39ebf4267705" containerName="glance-log" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.502715 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="541bebd5-0485-4874-98e4-39ebf4267705" containerName="glance-log" Dec 03 12:47:40 crc kubenswrapper[4711]: E1203 12:47:40.502731 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11699312-ea2a-42c3-8614-1593b6275d91" containerName="glance-db-sync" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.502742 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="11699312-ea2a-42c3-8614-1593b6275d91" containerName="glance-db-sync" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.502979 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="541bebd5-0485-4874-98e4-39ebf4267705" containerName="glance-log" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.503089 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="11699312-ea2a-42c3-8614-1593b6275d91" containerName="glance-db-sync" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.503107 4711 
memory_manager.go:354] "RemoveStaleState removing state" podUID="8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4" containerName="glance-log" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.503122 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="541bebd5-0485-4874-98e4-39ebf4267705" containerName="glance-httpd" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.503142 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fdf7796-6e30-4ac7-9bcd-6ca3a398d2d4" containerName="glance-httpd" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.504258 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.506503 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-public-svc" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.506755 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-internal-svc" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.507242 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-nlh2p" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.508587 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.508760 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"combined-ca-bundle" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.510632 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.516602 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 03 12:47:40 crc 
kubenswrapper[4711]: I1203 12:47:40.608888 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7495ec-7cd1-4aaf-97a3-0a466f344832-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.609283 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc7495ec-7cd1-4aaf-97a3-0a466f344832-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.609441 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc7495ec-7cd1-4aaf-97a3-0a466f344832-logs\") pod \"glance-default-single-0\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.609526 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc7495ec-7cd1-4aaf-97a3-0a466f344832-httpd-run\") pod \"glance-default-single-0\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.609611 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7495ec-7cd1-4aaf-97a3-0a466f344832-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 
03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.609722 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-single-0\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.609803 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brgfz\" (UniqueName: \"kubernetes.io/projected/fc7495ec-7cd1-4aaf-97a3-0a466f344832-kube-api-access-brgfz\") pod \"glance-default-single-0\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.609952 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc7495ec-7cd1-4aaf-97a3-0a466f344832-scripts\") pod \"glance-default-single-0\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.610039 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc7495ec-7cd1-4aaf-97a3-0a466f344832-config-data\") pod \"glance-default-single-0\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.712146 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7495ec-7cd1-4aaf-97a3-0a466f344832-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:40 crc 
kubenswrapper[4711]: I1203 12:47:40.712228 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc7495ec-7cd1-4aaf-97a3-0a466f344832-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.712287 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc7495ec-7cd1-4aaf-97a3-0a466f344832-logs\") pod \"glance-default-single-0\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.712316 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc7495ec-7cd1-4aaf-97a3-0a466f344832-httpd-run\") pod \"glance-default-single-0\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.712344 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7495ec-7cd1-4aaf-97a3-0a466f344832-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.712400 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-single-0\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.712427 4711 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-brgfz\" (UniqueName: \"kubernetes.io/projected/fc7495ec-7cd1-4aaf-97a3-0a466f344832-kube-api-access-brgfz\") pod \"glance-default-single-0\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.712490 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc7495ec-7cd1-4aaf-97a3-0a466f344832-scripts\") pod \"glance-default-single-0\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.712519 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc7495ec-7cd1-4aaf-97a3-0a466f344832-config-data\") pod \"glance-default-single-0\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.713644 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc7495ec-7cd1-4aaf-97a3-0a466f344832-httpd-run\") pod \"glance-default-single-0\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.713871 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc7495ec-7cd1-4aaf-97a3-0a466f344832-logs\") pod \"glance-default-single-0\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.713875 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-single-0\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") device mount path \"/mnt/openstack/pv07\"" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.719147 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7495ec-7cd1-4aaf-97a3-0a466f344832-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.721539 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc7495ec-7cd1-4aaf-97a3-0a466f344832-config-data\") pod \"glance-default-single-0\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.721224 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc7495ec-7cd1-4aaf-97a3-0a466f344832-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.721868 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7495ec-7cd1-4aaf-97a3-0a466f344832-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.722651 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc7495ec-7cd1-4aaf-97a3-0a466f344832-scripts\") pod 
\"glance-default-single-0\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.735947 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brgfz\" (UniqueName: \"kubernetes.io/projected/fc7495ec-7cd1-4aaf-97a3-0a466f344832-kube-api-access-brgfz\") pod \"glance-default-single-0\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.739379 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-single-0\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:40 crc kubenswrapper[4711]: I1203 12:47:40.825718 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:41 crc kubenswrapper[4711]: I1203 12:47:41.245715 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 03 12:47:42 crc kubenswrapper[4711]: I1203 12:47:42.060457 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"fc7495ec-7cd1-4aaf-97a3-0a466f344832","Type":"ContainerStarted","Data":"23f91067343a0b1bab9dfa72ddb2d9a8b4d86f8c19a49847fde4d08eedeab757"} Dec 03 12:47:42 crc kubenswrapper[4711]: I1203 12:47:42.060890 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"fc7495ec-7cd1-4aaf-97a3-0a466f344832","Type":"ContainerStarted","Data":"7fb8e83056c9dec09f8920cfe2d903354f7a61e3bb03a978d059e84cdf83a59c"} Dec 03 12:47:43 crc kubenswrapper[4711]: I1203 12:47:43.072525 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"fc7495ec-7cd1-4aaf-97a3-0a466f344832","Type":"ContainerStarted","Data":"ba7c7cd4c1f1fa1b4a3485e8b9eb17ae3467263a87dd27ccc3b2df361ab14873"} Dec 03 12:47:43 crc kubenswrapper[4711]: I1203 12:47:43.104683 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=3.104652543 podStartE2EDuration="3.104652543s" podCreationTimestamp="2025-12-03 12:47:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:47:43.098498944 +0000 UTC m=+1981.767750249" watchObservedRunningTime="2025-12-03 12:47:43.104652543 +0000 UTC m=+1981.773903818" Dec 03 12:47:50 crc kubenswrapper[4711]: I1203 12:47:50.826789 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:50 crc kubenswrapper[4711]: I1203 12:47:50.827502 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:50 crc kubenswrapper[4711]: I1203 12:47:50.876763 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:50 crc kubenswrapper[4711]: I1203 12:47:50.888400 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:51 crc kubenswrapper[4711]: I1203 12:47:51.148346 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:51 crc kubenswrapper[4711]: I1203 12:47:51.148398 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:53 crc kubenswrapper[4711]: I1203 12:47:53.045202 4711 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:53 crc kubenswrapper[4711]: I1203 12:47:53.125407 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:54 crc kubenswrapper[4711]: I1203 12:47:54.447194 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-2gmg2"] Dec 03 12:47:54 crc kubenswrapper[4711]: I1203 12:47:54.462673 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-2gmg2"] Dec 03 12:47:54 crc kubenswrapper[4711]: I1203 12:47:54.499801 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance3e42-account-delete-4ndgb"] Dec 03 12:47:54 crc kubenswrapper[4711]: I1203 12:47:54.500685 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance3e42-account-delete-4ndgb" Dec 03 12:47:54 crc kubenswrapper[4711]: I1203 12:47:54.523825 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8aae18e8-b194-4173-8587-2abd0b168c88-operator-scripts\") pod \"glance3e42-account-delete-4ndgb\" (UID: \"8aae18e8-b194-4173-8587-2abd0b168c88\") " pod="glance-kuttl-tests/glance3e42-account-delete-4ndgb" Dec 03 12:47:54 crc kubenswrapper[4711]: I1203 12:47:54.523887 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7zwf\" (UniqueName: \"kubernetes.io/projected/8aae18e8-b194-4173-8587-2abd0b168c88-kube-api-access-g7zwf\") pod \"glance3e42-account-delete-4ndgb\" (UID: \"8aae18e8-b194-4173-8587-2abd0b168c88\") " pod="glance-kuttl-tests/glance3e42-account-delete-4ndgb" Dec 03 12:47:54 crc kubenswrapper[4711]: I1203 12:47:54.541160 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["glance-kuttl-tests/glance3e42-account-delete-4ndgb"] Dec 03 12:47:54 crc kubenswrapper[4711]: I1203 12:47:54.567598 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 03 12:47:54 crc kubenswrapper[4711]: I1203 12:47:54.625611 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7zwf\" (UniqueName: \"kubernetes.io/projected/8aae18e8-b194-4173-8587-2abd0b168c88-kube-api-access-g7zwf\") pod \"glance3e42-account-delete-4ndgb\" (UID: \"8aae18e8-b194-4173-8587-2abd0b168c88\") " pod="glance-kuttl-tests/glance3e42-account-delete-4ndgb" Dec 03 12:47:54 crc kubenswrapper[4711]: I1203 12:47:54.625748 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8aae18e8-b194-4173-8587-2abd0b168c88-operator-scripts\") pod \"glance3e42-account-delete-4ndgb\" (UID: \"8aae18e8-b194-4173-8587-2abd0b168c88\") " pod="glance-kuttl-tests/glance3e42-account-delete-4ndgb" Dec 03 12:47:54 crc kubenswrapper[4711]: I1203 12:47:54.626458 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8aae18e8-b194-4173-8587-2abd0b168c88-operator-scripts\") pod \"glance3e42-account-delete-4ndgb\" (UID: \"8aae18e8-b194-4173-8587-2abd0b168c88\") " pod="glance-kuttl-tests/glance3e42-account-delete-4ndgb" Dec 03 12:47:54 crc kubenswrapper[4711]: I1203 12:47:54.648608 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7zwf\" (UniqueName: \"kubernetes.io/projected/8aae18e8-b194-4173-8587-2abd0b168c88-kube-api-access-g7zwf\") pod \"glance3e42-account-delete-4ndgb\" (UID: \"8aae18e8-b194-4173-8587-2abd0b168c88\") " pod="glance-kuttl-tests/glance3e42-account-delete-4ndgb" Dec 03 12:47:54 crc kubenswrapper[4711]: I1203 12:47:54.821821 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance3e42-account-delete-4ndgb" Dec 03 12:47:55 crc kubenswrapper[4711]: I1203 12:47:55.194498 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="fc7495ec-7cd1-4aaf-97a3-0a466f344832" containerName="glance-log" containerID="cri-o://23f91067343a0b1bab9dfa72ddb2d9a8b4d86f8c19a49847fde4d08eedeab757" gracePeriod=30 Dec 03 12:47:55 crc kubenswrapper[4711]: I1203 12:47:55.194618 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="fc7495ec-7cd1-4aaf-97a3-0a466f344832" containerName="glance-httpd" containerID="cri-o://ba7c7cd4c1f1fa1b4a3485e8b9eb17ae3467263a87dd27ccc3b2df361ab14873" gracePeriod=30 Dec 03 12:47:55 crc kubenswrapper[4711]: I1203 12:47:55.259536 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance3e42-account-delete-4ndgb"] Dec 03 12:47:55 crc kubenswrapper[4711]: W1203 12:47:55.259977 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8aae18e8_b194_4173_8587_2abd0b168c88.slice/crio-d89808c8164ee4738ba42465178a135f0c97433a551f704060e2c2bea58e3792 WatchSource:0}: Error finding container d89808c8164ee4738ba42465178a135f0c97433a551f704060e2c2bea58e3792: Status 404 returned error can't find the container with id d89808c8164ee4738ba42465178a135f0c97433a551f704060e2c2bea58e3792 Dec 03 12:47:55 crc kubenswrapper[4711]: I1203 12:47:55.826300 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11699312-ea2a-42c3-8614-1593b6275d91" path="/var/lib/kubelet/pods/11699312-ea2a-42c3-8614-1593b6275d91/volumes" Dec 03 12:47:56 crc kubenswrapper[4711]: I1203 12:47:56.205433 4711 generic.go:334] "Generic (PLEG): container finished" podID="8aae18e8-b194-4173-8587-2abd0b168c88" 
containerID="261c06a2564639d9ed8c4652ec49d459a501d4bcdd68ee33601177a986a52a39" exitCode=0 Dec 03 12:47:56 crc kubenswrapper[4711]: I1203 12:47:56.205563 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance3e42-account-delete-4ndgb" event={"ID":"8aae18e8-b194-4173-8587-2abd0b168c88","Type":"ContainerDied","Data":"261c06a2564639d9ed8c4652ec49d459a501d4bcdd68ee33601177a986a52a39"} Dec 03 12:47:56 crc kubenswrapper[4711]: I1203 12:47:56.205624 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance3e42-account-delete-4ndgb" event={"ID":"8aae18e8-b194-4173-8587-2abd0b168c88","Type":"ContainerStarted","Data":"d89808c8164ee4738ba42465178a135f0c97433a551f704060e2c2bea58e3792"} Dec 03 12:47:56 crc kubenswrapper[4711]: I1203 12:47:56.211473 4711 generic.go:334] "Generic (PLEG): container finished" podID="fc7495ec-7cd1-4aaf-97a3-0a466f344832" containerID="23f91067343a0b1bab9dfa72ddb2d9a8b4d86f8c19a49847fde4d08eedeab757" exitCode=143 Dec 03 12:47:56 crc kubenswrapper[4711]: I1203 12:47:56.211527 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"fc7495ec-7cd1-4aaf-97a3-0a466f344832","Type":"ContainerDied","Data":"23f91067343a0b1bab9dfa72ddb2d9a8b4d86f8c19a49847fde4d08eedeab757"} Dec 03 12:47:57 crc kubenswrapper[4711]: I1203 12:47:57.485535 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance3e42-account-delete-4ndgb" Dec 03 12:47:57 crc kubenswrapper[4711]: I1203 12:47:57.667398 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7zwf\" (UniqueName: \"kubernetes.io/projected/8aae18e8-b194-4173-8587-2abd0b168c88-kube-api-access-g7zwf\") pod \"8aae18e8-b194-4173-8587-2abd0b168c88\" (UID: \"8aae18e8-b194-4173-8587-2abd0b168c88\") " Dec 03 12:47:57 crc kubenswrapper[4711]: I1203 12:47:57.668171 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8aae18e8-b194-4173-8587-2abd0b168c88-operator-scripts\") pod \"8aae18e8-b194-4173-8587-2abd0b168c88\" (UID: \"8aae18e8-b194-4173-8587-2abd0b168c88\") " Dec 03 12:47:57 crc kubenswrapper[4711]: I1203 12:47:57.668766 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8aae18e8-b194-4173-8587-2abd0b168c88-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8aae18e8-b194-4173-8587-2abd0b168c88" (UID: "8aae18e8-b194-4173-8587-2abd0b168c88"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:47:57 crc kubenswrapper[4711]: I1203 12:47:57.674547 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aae18e8-b194-4173-8587-2abd0b168c88-kube-api-access-g7zwf" (OuterVolumeSpecName: "kube-api-access-g7zwf") pod "8aae18e8-b194-4173-8587-2abd0b168c88" (UID: "8aae18e8-b194-4173-8587-2abd0b168c88"). InnerVolumeSpecName "kube-api-access-g7zwf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:47:57 crc kubenswrapper[4711]: I1203 12:47:57.769515 4711 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8aae18e8-b194-4173-8587-2abd0b168c88-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:57 crc kubenswrapper[4711]: I1203 12:47:57.769561 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7zwf\" (UniqueName: \"kubernetes.io/projected/8aae18e8-b194-4173-8587-2abd0b168c88-kube-api-access-g7zwf\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.229529 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance3e42-account-delete-4ndgb" event={"ID":"8aae18e8-b194-4173-8587-2abd0b168c88","Type":"ContainerDied","Data":"d89808c8164ee4738ba42465178a135f0c97433a551f704060e2c2bea58e3792"} Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.229576 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d89808c8164ee4738ba42465178a135f0c97433a551f704060e2c2bea58e3792" Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.229579 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance3e42-account-delete-4ndgb" Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.726283 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.782758 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.782817 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc7495ec-7cd1-4aaf-97a3-0a466f344832-httpd-run\") pod \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.782858 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc7495ec-7cd1-4aaf-97a3-0a466f344832-config-data\") pod \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.782876 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc7495ec-7cd1-4aaf-97a3-0a466f344832-logs\") pod \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.782930 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc7495ec-7cd1-4aaf-97a3-0a466f344832-combined-ca-bundle\") pod \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.782949 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fc7495ec-7cd1-4aaf-97a3-0a466f344832-public-tls-certs\") pod \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.782994 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc7495ec-7cd1-4aaf-97a3-0a466f344832-scripts\") pod \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.783011 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7495ec-7cd1-4aaf-97a3-0a466f344832-internal-tls-certs\") pod \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.783682 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc7495ec-7cd1-4aaf-97a3-0a466f344832-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fc7495ec-7cd1-4aaf-97a3-0a466f344832" (UID: "fc7495ec-7cd1-4aaf-97a3-0a466f344832"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.783982 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc7495ec-7cd1-4aaf-97a3-0a466f344832-logs" (OuterVolumeSpecName: "logs") pod "fc7495ec-7cd1-4aaf-97a3-0a466f344832" (UID: "fc7495ec-7cd1-4aaf-97a3-0a466f344832"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.786989 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc7495ec-7cd1-4aaf-97a3-0a466f344832-scripts" (OuterVolumeSpecName: "scripts") pod "fc7495ec-7cd1-4aaf-97a3-0a466f344832" (UID: "fc7495ec-7cd1-4aaf-97a3-0a466f344832"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.787092 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "fc7495ec-7cd1-4aaf-97a3-0a466f344832" (UID: "fc7495ec-7cd1-4aaf-97a3-0a466f344832"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.802447 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc7495ec-7cd1-4aaf-97a3-0a466f344832-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc7495ec-7cd1-4aaf-97a3-0a466f344832" (UID: "fc7495ec-7cd1-4aaf-97a3-0a466f344832"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.816982 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc7495ec-7cd1-4aaf-97a3-0a466f344832-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fc7495ec-7cd1-4aaf-97a3-0a466f344832" (UID: "fc7495ec-7cd1-4aaf-97a3-0a466f344832"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.821077 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc7495ec-7cd1-4aaf-97a3-0a466f344832-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fc7495ec-7cd1-4aaf-97a3-0a466f344832" (UID: "fc7495ec-7cd1-4aaf-97a3-0a466f344832"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.842925 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc7495ec-7cd1-4aaf-97a3-0a466f344832-config-data" (OuterVolumeSpecName: "config-data") pod "fc7495ec-7cd1-4aaf-97a3-0a466f344832" (UID: "fc7495ec-7cd1-4aaf-97a3-0a466f344832"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.883700 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brgfz\" (UniqueName: \"kubernetes.io/projected/fc7495ec-7cd1-4aaf-97a3-0a466f344832-kube-api-access-brgfz\") pod \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\" (UID: \"fc7495ec-7cd1-4aaf-97a3-0a466f344832\") " Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.884351 4711 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc7495ec-7cd1-4aaf-97a3-0a466f344832-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.884382 4711 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7495ec-7cd1-4aaf-97a3-0a466f344832-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.884402 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.884411 4711 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc7495ec-7cd1-4aaf-97a3-0a466f344832-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.884420 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc7495ec-7cd1-4aaf-97a3-0a466f344832-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.884429 4711 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc7495ec-7cd1-4aaf-97a3-0a466f344832-logs\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.884436 4711 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc7495ec-7cd1-4aaf-97a3-0a466f344832-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.884444 4711 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7495ec-7cd1-4aaf-97a3-0a466f344832-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.886713 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc7495ec-7cd1-4aaf-97a3-0a466f344832-kube-api-access-brgfz" (OuterVolumeSpecName: "kube-api-access-brgfz") pod "fc7495ec-7cd1-4aaf-97a3-0a466f344832" (UID: "fc7495ec-7cd1-4aaf-97a3-0a466f344832"). InnerVolumeSpecName "kube-api-access-brgfz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.896941 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.985463 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brgfz\" (UniqueName: \"kubernetes.io/projected/fc7495ec-7cd1-4aaf-97a3-0a466f344832-kube-api-access-brgfz\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:58 crc kubenswrapper[4711]: I1203 12:47:58.985505 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:59 crc kubenswrapper[4711]: I1203 12:47:59.238986 4711 generic.go:334] "Generic (PLEG): container finished" podID="fc7495ec-7cd1-4aaf-97a3-0a466f344832" containerID="ba7c7cd4c1f1fa1b4a3485e8b9eb17ae3467263a87dd27ccc3b2df361ab14873" exitCode=0 Dec 03 12:47:59 crc kubenswrapper[4711]: I1203 12:47:59.239019 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:47:59 crc kubenswrapper[4711]: I1203 12:47:59.239034 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"fc7495ec-7cd1-4aaf-97a3-0a466f344832","Type":"ContainerDied","Data":"ba7c7cd4c1f1fa1b4a3485e8b9eb17ae3467263a87dd27ccc3b2df361ab14873"} Dec 03 12:47:59 crc kubenswrapper[4711]: I1203 12:47:59.239071 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"fc7495ec-7cd1-4aaf-97a3-0a466f344832","Type":"ContainerDied","Data":"7fb8e83056c9dec09f8920cfe2d903354f7a61e3bb03a978d059e84cdf83a59c"} Dec 03 12:47:59 crc kubenswrapper[4711]: I1203 12:47:59.239095 4711 scope.go:117] "RemoveContainer" containerID="ba7c7cd4c1f1fa1b4a3485e8b9eb17ae3467263a87dd27ccc3b2df361ab14873" Dec 03 12:47:59 crc kubenswrapper[4711]: I1203 12:47:59.268020 4711 scope.go:117] "RemoveContainer" containerID="23f91067343a0b1bab9dfa72ddb2d9a8b4d86f8c19a49847fde4d08eedeab757" Dec 03 12:47:59 crc kubenswrapper[4711]: I1203 12:47:59.290429 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 03 12:47:59 crc kubenswrapper[4711]: I1203 12:47:59.298228 4711 scope.go:117] "RemoveContainer" containerID="ba7c7cd4c1f1fa1b4a3485e8b9eb17ae3467263a87dd27ccc3b2df361ab14873" Dec 03 12:47:59 crc kubenswrapper[4711]: E1203 12:47:59.298591 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba7c7cd4c1f1fa1b4a3485e8b9eb17ae3467263a87dd27ccc3b2df361ab14873\": container with ID starting with ba7c7cd4c1f1fa1b4a3485e8b9eb17ae3467263a87dd27ccc3b2df361ab14873 not found: ID does not exist" containerID="ba7c7cd4c1f1fa1b4a3485e8b9eb17ae3467263a87dd27ccc3b2df361ab14873" Dec 03 12:47:59 crc kubenswrapper[4711]: I1203 12:47:59.298626 4711 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"ba7c7cd4c1f1fa1b4a3485e8b9eb17ae3467263a87dd27ccc3b2df361ab14873"} err="failed to get container status \"ba7c7cd4c1f1fa1b4a3485e8b9eb17ae3467263a87dd27ccc3b2df361ab14873\": rpc error: code = NotFound desc = could not find container \"ba7c7cd4c1f1fa1b4a3485e8b9eb17ae3467263a87dd27ccc3b2df361ab14873\": container with ID starting with ba7c7cd4c1f1fa1b4a3485e8b9eb17ae3467263a87dd27ccc3b2df361ab14873 not found: ID does not exist" Dec 03 12:47:59 crc kubenswrapper[4711]: I1203 12:47:59.298647 4711 scope.go:117] "RemoveContainer" containerID="23f91067343a0b1bab9dfa72ddb2d9a8b4d86f8c19a49847fde4d08eedeab757" Dec 03 12:47:59 crc kubenswrapper[4711]: E1203 12:47:59.298895 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23f91067343a0b1bab9dfa72ddb2d9a8b4d86f8c19a49847fde4d08eedeab757\": container with ID starting with 23f91067343a0b1bab9dfa72ddb2d9a8b4d86f8c19a49847fde4d08eedeab757 not found: ID does not exist" containerID="23f91067343a0b1bab9dfa72ddb2d9a8b4d86f8c19a49847fde4d08eedeab757" Dec 03 12:47:59 crc kubenswrapper[4711]: I1203 12:47:59.298931 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23f91067343a0b1bab9dfa72ddb2d9a8b4d86f8c19a49847fde4d08eedeab757"} err="failed to get container status \"23f91067343a0b1bab9dfa72ddb2d9a8b4d86f8c19a49847fde4d08eedeab757\": rpc error: code = NotFound desc = could not find container \"23f91067343a0b1bab9dfa72ddb2d9a8b4d86f8c19a49847fde4d08eedeab757\": container with ID starting with 23f91067343a0b1bab9dfa72ddb2d9a8b4d86f8c19a49847fde4d08eedeab757 not found: ID does not exist" Dec 03 12:47:59 crc kubenswrapper[4711]: I1203 12:47:59.301161 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 03 12:47:59 crc kubenswrapper[4711]: I1203 12:47:59.523981 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["glance-kuttl-tests/glance-db-create-q2b5k"] Dec 03 12:47:59 crc kubenswrapper[4711]: I1203 12:47:59.536380 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-q2b5k"] Dec 03 12:47:59 crc kubenswrapper[4711]: I1203 12:47:59.543073 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance3e42-account-delete-4ndgb"] Dec 03 12:47:59 crc kubenswrapper[4711]: I1203 12:47:59.548197 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-3e42-account-create-update-jgzx7"] Dec 03 12:47:59 crc kubenswrapper[4711]: I1203 12:47:59.552680 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance3e42-account-delete-4ndgb"] Dec 03 12:47:59 crc kubenswrapper[4711]: I1203 12:47:59.557563 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-3e42-account-create-update-jgzx7"] Dec 03 12:47:59 crc kubenswrapper[4711]: I1203 12:47:59.828468 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="318f131d-3285-4143-bb7c-495038d90363" path="/var/lib/kubelet/pods/318f131d-3285-4143-bb7c-495038d90363/volumes" Dec 03 12:47:59 crc kubenswrapper[4711]: I1203 12:47:59.828982 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="784295a4-13fe-4196-adb7-a007bf0c487e" path="/var/lib/kubelet/pods/784295a4-13fe-4196-adb7-a007bf0c487e/volumes" Dec 03 12:47:59 crc kubenswrapper[4711]: I1203 12:47:59.829438 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8aae18e8-b194-4173-8587-2abd0b168c88" path="/var/lib/kubelet/pods/8aae18e8-b194-4173-8587-2abd0b168c88/volumes" Dec 03 12:47:59 crc kubenswrapper[4711]: I1203 12:47:59.830400 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc7495ec-7cd1-4aaf-97a3-0a466f344832" path="/var/lib/kubelet/pods/fc7495ec-7cd1-4aaf-97a3-0a466f344832/volumes" Dec 03 12:48:01 crc kubenswrapper[4711]: I1203 
12:48:01.002494 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-kv5n2"] Dec 03 12:48:01 crc kubenswrapper[4711]: E1203 12:48:01.002795 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aae18e8-b194-4173-8587-2abd0b168c88" containerName="mariadb-account-delete" Dec 03 12:48:01 crc kubenswrapper[4711]: I1203 12:48:01.002808 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aae18e8-b194-4173-8587-2abd0b168c88" containerName="mariadb-account-delete" Dec 03 12:48:01 crc kubenswrapper[4711]: E1203 12:48:01.002817 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc7495ec-7cd1-4aaf-97a3-0a466f344832" containerName="glance-httpd" Dec 03 12:48:01 crc kubenswrapper[4711]: I1203 12:48:01.002823 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc7495ec-7cd1-4aaf-97a3-0a466f344832" containerName="glance-httpd" Dec 03 12:48:01 crc kubenswrapper[4711]: E1203 12:48:01.002842 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc7495ec-7cd1-4aaf-97a3-0a466f344832" containerName="glance-log" Dec 03 12:48:01 crc kubenswrapper[4711]: I1203 12:48:01.002849 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc7495ec-7cd1-4aaf-97a3-0a466f344832" containerName="glance-log" Dec 03 12:48:01 crc kubenswrapper[4711]: I1203 12:48:01.003071 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="8aae18e8-b194-4173-8587-2abd0b168c88" containerName="mariadb-account-delete" Dec 03 12:48:01 crc kubenswrapper[4711]: I1203 12:48:01.003083 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc7495ec-7cd1-4aaf-97a3-0a466f344832" containerName="glance-log" Dec 03 12:48:01 crc kubenswrapper[4711]: I1203 12:48:01.003091 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc7495ec-7cd1-4aaf-97a3-0a466f344832" containerName="glance-httpd" Dec 03 12:48:01 crc kubenswrapper[4711]: I1203 12:48:01.003548 4711 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-kv5n2" Dec 03 12:48:01 crc kubenswrapper[4711]: I1203 12:48:01.014621 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-kv5n2"] Dec 03 12:48:01 crc kubenswrapper[4711]: I1203 12:48:01.017854 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgp5z\" (UniqueName: \"kubernetes.io/projected/867531ae-9b99-4428-a807-54069dfb9da9-kube-api-access-kgp5z\") pod \"glance-db-create-kv5n2\" (UID: \"867531ae-9b99-4428-a807-54069dfb9da9\") " pod="glance-kuttl-tests/glance-db-create-kv5n2" Dec 03 12:48:01 crc kubenswrapper[4711]: I1203 12:48:01.018116 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/867531ae-9b99-4428-a807-54069dfb9da9-operator-scripts\") pod \"glance-db-create-kv5n2\" (UID: \"867531ae-9b99-4428-a807-54069dfb9da9\") " pod="glance-kuttl-tests/glance-db-create-kv5n2" Dec 03 12:48:01 crc kubenswrapper[4711]: I1203 12:48:01.098817 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-a0cd-account-create-update-wn2w2"] Dec 03 12:48:01 crc kubenswrapper[4711]: I1203 12:48:01.099695 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-a0cd-account-create-update-wn2w2" Dec 03 12:48:01 crc kubenswrapper[4711]: I1203 12:48:01.102372 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Dec 03 12:48:01 crc kubenswrapper[4711]: I1203 12:48:01.109710 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-a0cd-account-create-update-wn2w2"] Dec 03 12:48:01 crc kubenswrapper[4711]: I1203 12:48:01.118985 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/867531ae-9b99-4428-a807-54069dfb9da9-operator-scripts\") pod \"glance-db-create-kv5n2\" (UID: \"867531ae-9b99-4428-a807-54069dfb9da9\") " pod="glance-kuttl-tests/glance-db-create-kv5n2" Dec 03 12:48:01 crc kubenswrapper[4711]: I1203 12:48:01.119054 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqg2t\" (UniqueName: \"kubernetes.io/projected/ea294a21-cc69-4aa6-aa5e-51b4f7c86b68-kube-api-access-sqg2t\") pod \"glance-a0cd-account-create-update-wn2w2\" (UID: \"ea294a21-cc69-4aa6-aa5e-51b4f7c86b68\") " pod="glance-kuttl-tests/glance-a0cd-account-create-update-wn2w2" Dec 03 12:48:01 crc kubenswrapper[4711]: I1203 12:48:01.119088 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea294a21-cc69-4aa6-aa5e-51b4f7c86b68-operator-scripts\") pod \"glance-a0cd-account-create-update-wn2w2\" (UID: \"ea294a21-cc69-4aa6-aa5e-51b4f7c86b68\") " pod="glance-kuttl-tests/glance-a0cd-account-create-update-wn2w2" Dec 03 12:48:01 crc kubenswrapper[4711]: I1203 12:48:01.119136 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgp5z\" (UniqueName: 
\"kubernetes.io/projected/867531ae-9b99-4428-a807-54069dfb9da9-kube-api-access-kgp5z\") pod \"glance-db-create-kv5n2\" (UID: \"867531ae-9b99-4428-a807-54069dfb9da9\") " pod="glance-kuttl-tests/glance-db-create-kv5n2" Dec 03 12:48:01 crc kubenswrapper[4711]: I1203 12:48:01.120203 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/867531ae-9b99-4428-a807-54069dfb9da9-operator-scripts\") pod \"glance-db-create-kv5n2\" (UID: \"867531ae-9b99-4428-a807-54069dfb9da9\") " pod="glance-kuttl-tests/glance-db-create-kv5n2" Dec 03 12:48:01 crc kubenswrapper[4711]: I1203 12:48:01.140869 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgp5z\" (UniqueName: \"kubernetes.io/projected/867531ae-9b99-4428-a807-54069dfb9da9-kube-api-access-kgp5z\") pod \"glance-db-create-kv5n2\" (UID: \"867531ae-9b99-4428-a807-54069dfb9da9\") " pod="glance-kuttl-tests/glance-db-create-kv5n2" Dec 03 12:48:01 crc kubenswrapper[4711]: I1203 12:48:01.220581 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea294a21-cc69-4aa6-aa5e-51b4f7c86b68-operator-scripts\") pod \"glance-a0cd-account-create-update-wn2w2\" (UID: \"ea294a21-cc69-4aa6-aa5e-51b4f7c86b68\") " pod="glance-kuttl-tests/glance-a0cd-account-create-update-wn2w2" Dec 03 12:48:01 crc kubenswrapper[4711]: I1203 12:48:01.220777 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqg2t\" (UniqueName: \"kubernetes.io/projected/ea294a21-cc69-4aa6-aa5e-51b4f7c86b68-kube-api-access-sqg2t\") pod \"glance-a0cd-account-create-update-wn2w2\" (UID: \"ea294a21-cc69-4aa6-aa5e-51b4f7c86b68\") " pod="glance-kuttl-tests/glance-a0cd-account-create-update-wn2w2" Dec 03 12:48:01 crc kubenswrapper[4711]: I1203 12:48:01.221481 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea294a21-cc69-4aa6-aa5e-51b4f7c86b68-operator-scripts\") pod \"glance-a0cd-account-create-update-wn2w2\" (UID: \"ea294a21-cc69-4aa6-aa5e-51b4f7c86b68\") " pod="glance-kuttl-tests/glance-a0cd-account-create-update-wn2w2" Dec 03 12:48:01 crc kubenswrapper[4711]: I1203 12:48:01.238376 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqg2t\" (UniqueName: \"kubernetes.io/projected/ea294a21-cc69-4aa6-aa5e-51b4f7c86b68-kube-api-access-sqg2t\") pod \"glance-a0cd-account-create-update-wn2w2\" (UID: \"ea294a21-cc69-4aa6-aa5e-51b4f7c86b68\") " pod="glance-kuttl-tests/glance-a0cd-account-create-update-wn2w2" Dec 03 12:48:01 crc kubenswrapper[4711]: I1203 12:48:01.319793 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-kv5n2" Dec 03 12:48:01 crc kubenswrapper[4711]: I1203 12:48:01.416810 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-a0cd-account-create-update-wn2w2" Dec 03 12:48:01 crc kubenswrapper[4711]: I1203 12:48:01.861788 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-a0cd-account-create-update-wn2w2"] Dec 03 12:48:01 crc kubenswrapper[4711]: W1203 12:48:01.864861 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea294a21_cc69_4aa6_aa5e_51b4f7c86b68.slice/crio-377a95834f12dba72601fa9e2a2eda8d563805cb3c59ecf3edc0f0c66c817216 WatchSource:0}: Error finding container 377a95834f12dba72601fa9e2a2eda8d563805cb3c59ecf3edc0f0c66c817216: Status 404 returned error can't find the container with id 377a95834f12dba72601fa9e2a2eda8d563805cb3c59ecf3edc0f0c66c817216 Dec 03 12:48:01 crc kubenswrapper[4711]: I1203 12:48:01.867844 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-kv5n2"] Dec 03 12:48:01 crc kubenswrapper[4711]: W1203 12:48:01.868266 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod867531ae_9b99_4428_a807_54069dfb9da9.slice/crio-5036431ddb6c5d53041e4fbe22d267e3cd9c72349f0a1255705eaf62389e01fa WatchSource:0}: Error finding container 5036431ddb6c5d53041e4fbe22d267e3cd9c72349f0a1255705eaf62389e01fa: Status 404 returned error can't find the container with id 5036431ddb6c5d53041e4fbe22d267e3cd9c72349f0a1255705eaf62389e01fa Dec 03 12:48:02 crc kubenswrapper[4711]: I1203 12:48:02.270894 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-a0cd-account-create-update-wn2w2" event={"ID":"ea294a21-cc69-4aa6-aa5e-51b4f7c86b68","Type":"ContainerStarted","Data":"9fa0d8c99d2c47b1f770f04652f96fd8b76b35a6947e82f61bb8923cccfd6a3e"} Dec 03 12:48:02 crc kubenswrapper[4711]: I1203 12:48:02.271208 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-a0cd-account-create-update-wn2w2" event={"ID":"ea294a21-cc69-4aa6-aa5e-51b4f7c86b68","Type":"ContainerStarted","Data":"377a95834f12dba72601fa9e2a2eda8d563805cb3c59ecf3edc0f0c66c817216"} Dec 03 12:48:02 crc kubenswrapper[4711]: I1203 12:48:02.273136 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-kv5n2" event={"ID":"867531ae-9b99-4428-a807-54069dfb9da9","Type":"ContainerStarted","Data":"dd72ac794d49686367b52e03b054cb0cd4ccad2c13c72a2725e71b582a673e8a"} Dec 03 12:48:02 crc kubenswrapper[4711]: I1203 12:48:02.273163 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-kv5n2" event={"ID":"867531ae-9b99-4428-a807-54069dfb9da9","Type":"ContainerStarted","Data":"5036431ddb6c5d53041e4fbe22d267e3cd9c72349f0a1255705eaf62389e01fa"} Dec 03 12:48:02 crc kubenswrapper[4711]: I1203 12:48:02.292841 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-a0cd-account-create-update-wn2w2" podStartSLOduration=1.292819479 podStartE2EDuration="1.292819479s" podCreationTimestamp="2025-12-03 12:48:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:48:02.285988642 +0000 UTC m=+2000.955239917" watchObservedRunningTime="2025-12-03 12:48:02.292819479 +0000 UTC m=+2000.962070734" Dec 03 12:48:02 crc kubenswrapper[4711]: I1203 12:48:02.306311 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-create-kv5n2" podStartSLOduration=2.3062934 podStartE2EDuration="2.3062934s" podCreationTimestamp="2025-12-03 12:48:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:48:02.299718109 +0000 UTC m=+2000.968969374" watchObservedRunningTime="2025-12-03 12:48:02.3062934 +0000 UTC 
m=+2000.975544655" Dec 03 12:48:03 crc kubenswrapper[4711]: I1203 12:48:03.281412 4711 generic.go:334] "Generic (PLEG): container finished" podID="ea294a21-cc69-4aa6-aa5e-51b4f7c86b68" containerID="9fa0d8c99d2c47b1f770f04652f96fd8b76b35a6947e82f61bb8923cccfd6a3e" exitCode=0 Dec 03 12:48:03 crc kubenswrapper[4711]: I1203 12:48:03.281470 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-a0cd-account-create-update-wn2w2" event={"ID":"ea294a21-cc69-4aa6-aa5e-51b4f7c86b68","Type":"ContainerDied","Data":"9fa0d8c99d2c47b1f770f04652f96fd8b76b35a6947e82f61bb8923cccfd6a3e"} Dec 03 12:48:03 crc kubenswrapper[4711]: I1203 12:48:03.283070 4711 generic.go:334] "Generic (PLEG): container finished" podID="867531ae-9b99-4428-a807-54069dfb9da9" containerID="dd72ac794d49686367b52e03b054cb0cd4ccad2c13c72a2725e71b582a673e8a" exitCode=0 Dec 03 12:48:03 crc kubenswrapper[4711]: I1203 12:48:03.283120 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-kv5n2" event={"ID":"867531ae-9b99-4428-a807-54069dfb9da9","Type":"ContainerDied","Data":"dd72ac794d49686367b52e03b054cb0cd4ccad2c13c72a2725e71b582a673e8a"} Dec 03 12:48:04 crc kubenswrapper[4711]: I1203 12:48:04.627264 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-a0cd-account-create-update-wn2w2" Dec 03 12:48:04 crc kubenswrapper[4711]: I1203 12:48:04.633287 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-kv5n2" Dec 03 12:48:04 crc kubenswrapper[4711]: I1203 12:48:04.771006 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgp5z\" (UniqueName: \"kubernetes.io/projected/867531ae-9b99-4428-a807-54069dfb9da9-kube-api-access-kgp5z\") pod \"867531ae-9b99-4428-a807-54069dfb9da9\" (UID: \"867531ae-9b99-4428-a807-54069dfb9da9\") " Dec 03 12:48:04 crc kubenswrapper[4711]: I1203 12:48:04.771106 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/867531ae-9b99-4428-a807-54069dfb9da9-operator-scripts\") pod \"867531ae-9b99-4428-a807-54069dfb9da9\" (UID: \"867531ae-9b99-4428-a807-54069dfb9da9\") " Dec 03 12:48:04 crc kubenswrapper[4711]: I1203 12:48:04.771131 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqg2t\" (UniqueName: \"kubernetes.io/projected/ea294a21-cc69-4aa6-aa5e-51b4f7c86b68-kube-api-access-sqg2t\") pod \"ea294a21-cc69-4aa6-aa5e-51b4f7c86b68\" (UID: \"ea294a21-cc69-4aa6-aa5e-51b4f7c86b68\") " Dec 03 12:48:04 crc kubenswrapper[4711]: I1203 12:48:04.771161 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea294a21-cc69-4aa6-aa5e-51b4f7c86b68-operator-scripts\") pod \"ea294a21-cc69-4aa6-aa5e-51b4f7c86b68\" (UID: \"ea294a21-cc69-4aa6-aa5e-51b4f7c86b68\") " Dec 03 12:48:04 crc kubenswrapper[4711]: I1203 12:48:04.772091 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/867531ae-9b99-4428-a807-54069dfb9da9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "867531ae-9b99-4428-a807-54069dfb9da9" (UID: "867531ae-9b99-4428-a807-54069dfb9da9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:48:04 crc kubenswrapper[4711]: I1203 12:48:04.772988 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea294a21-cc69-4aa6-aa5e-51b4f7c86b68-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ea294a21-cc69-4aa6-aa5e-51b4f7c86b68" (UID: "ea294a21-cc69-4aa6-aa5e-51b4f7c86b68"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:48:04 crc kubenswrapper[4711]: I1203 12:48:04.777771 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea294a21-cc69-4aa6-aa5e-51b4f7c86b68-kube-api-access-sqg2t" (OuterVolumeSpecName: "kube-api-access-sqg2t") pod "ea294a21-cc69-4aa6-aa5e-51b4f7c86b68" (UID: "ea294a21-cc69-4aa6-aa5e-51b4f7c86b68"). InnerVolumeSpecName "kube-api-access-sqg2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:48:04 crc kubenswrapper[4711]: I1203 12:48:04.777868 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/867531ae-9b99-4428-a807-54069dfb9da9-kube-api-access-kgp5z" (OuterVolumeSpecName: "kube-api-access-kgp5z") pod "867531ae-9b99-4428-a807-54069dfb9da9" (UID: "867531ae-9b99-4428-a807-54069dfb9da9"). InnerVolumeSpecName "kube-api-access-kgp5z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:48:04 crc kubenswrapper[4711]: I1203 12:48:04.872702 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgp5z\" (UniqueName: \"kubernetes.io/projected/867531ae-9b99-4428-a807-54069dfb9da9-kube-api-access-kgp5z\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:04 crc kubenswrapper[4711]: I1203 12:48:04.872746 4711 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/867531ae-9b99-4428-a807-54069dfb9da9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:04 crc kubenswrapper[4711]: I1203 12:48:04.872759 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqg2t\" (UniqueName: \"kubernetes.io/projected/ea294a21-cc69-4aa6-aa5e-51b4f7c86b68-kube-api-access-sqg2t\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:04 crc kubenswrapper[4711]: I1203 12:48:04.872771 4711 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea294a21-cc69-4aa6-aa5e-51b4f7c86b68-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:05 crc kubenswrapper[4711]: I1203 12:48:05.298206 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-a0cd-account-create-update-wn2w2" Dec 03 12:48:05 crc kubenswrapper[4711]: I1203 12:48:05.298217 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-a0cd-account-create-update-wn2w2" event={"ID":"ea294a21-cc69-4aa6-aa5e-51b4f7c86b68","Type":"ContainerDied","Data":"377a95834f12dba72601fa9e2a2eda8d563805cb3c59ecf3edc0f0c66c817216"} Dec 03 12:48:05 crc kubenswrapper[4711]: I1203 12:48:05.298631 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="377a95834f12dba72601fa9e2a2eda8d563805cb3c59ecf3edc0f0c66c817216" Dec 03 12:48:05 crc kubenswrapper[4711]: I1203 12:48:05.299822 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-kv5n2" event={"ID":"867531ae-9b99-4428-a807-54069dfb9da9","Type":"ContainerDied","Data":"5036431ddb6c5d53041e4fbe22d267e3cd9c72349f0a1255705eaf62389e01fa"} Dec 03 12:48:05 crc kubenswrapper[4711]: I1203 12:48:05.299877 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5036431ddb6c5d53041e4fbe22d267e3cd9c72349f0a1255705eaf62389e01fa" Dec 03 12:48:05 crc kubenswrapper[4711]: I1203 12:48:05.299882 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-kv5n2" Dec 03 12:48:05 crc kubenswrapper[4711]: I1203 12:48:05.401419 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:48:05 crc kubenswrapper[4711]: I1203 12:48:05.401479 4711 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:48:06 crc kubenswrapper[4711]: I1203 12:48:06.226734 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-rhspb"] Dec 03 12:48:06 crc kubenswrapper[4711]: E1203 12:48:06.227976 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="867531ae-9b99-4428-a807-54069dfb9da9" containerName="mariadb-database-create" Dec 03 12:48:06 crc kubenswrapper[4711]: I1203 12:48:06.228067 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="867531ae-9b99-4428-a807-54069dfb9da9" containerName="mariadb-database-create" Dec 03 12:48:06 crc kubenswrapper[4711]: E1203 12:48:06.228138 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea294a21-cc69-4aa6-aa5e-51b4f7c86b68" containerName="mariadb-account-create-update" Dec 03 12:48:06 crc kubenswrapper[4711]: I1203 12:48:06.228190 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea294a21-cc69-4aa6-aa5e-51b4f7c86b68" containerName="mariadb-account-create-update" Dec 03 12:48:06 crc kubenswrapper[4711]: I1203 12:48:06.228412 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea294a21-cc69-4aa6-aa5e-51b4f7c86b68" 
containerName="mariadb-account-create-update" Dec 03 12:48:06 crc kubenswrapper[4711]: I1203 12:48:06.228503 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="867531ae-9b99-4428-a807-54069dfb9da9" containerName="mariadb-database-create" Dec 03 12:48:06 crc kubenswrapper[4711]: I1203 12:48:06.229040 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-rhspb" Dec 03 12:48:06 crc kubenswrapper[4711]: I1203 12:48:06.232281 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-nmhdr" Dec 03 12:48:06 crc kubenswrapper[4711]: I1203 12:48:06.235346 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Dec 03 12:48:06 crc kubenswrapper[4711]: I1203 12:48:06.240264 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-rhspb"] Dec 03 12:48:06 crc kubenswrapper[4711]: I1203 12:48:06.397852 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wx9v\" (UniqueName: \"kubernetes.io/projected/e20e724b-0cd7-4c5a-813d-d814e2655a03-kube-api-access-8wx9v\") pod \"glance-db-sync-rhspb\" (UID: \"e20e724b-0cd7-4c5a-813d-d814e2655a03\") " pod="glance-kuttl-tests/glance-db-sync-rhspb" Dec 03 12:48:06 crc kubenswrapper[4711]: I1203 12:48:06.398040 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e20e724b-0cd7-4c5a-813d-d814e2655a03-config-data\") pod \"glance-db-sync-rhspb\" (UID: \"e20e724b-0cd7-4c5a-813d-d814e2655a03\") " pod="glance-kuttl-tests/glance-db-sync-rhspb" Dec 03 12:48:06 crc kubenswrapper[4711]: I1203 12:48:06.398184 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/e20e724b-0cd7-4c5a-813d-d814e2655a03-db-sync-config-data\") pod \"glance-db-sync-rhspb\" (UID: \"e20e724b-0cd7-4c5a-813d-d814e2655a03\") " pod="glance-kuttl-tests/glance-db-sync-rhspb" Dec 03 12:48:06 crc kubenswrapper[4711]: I1203 12:48:06.499988 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wx9v\" (UniqueName: \"kubernetes.io/projected/e20e724b-0cd7-4c5a-813d-d814e2655a03-kube-api-access-8wx9v\") pod \"glance-db-sync-rhspb\" (UID: \"e20e724b-0cd7-4c5a-813d-d814e2655a03\") " pod="glance-kuttl-tests/glance-db-sync-rhspb" Dec 03 12:48:06 crc kubenswrapper[4711]: I1203 12:48:06.500088 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e20e724b-0cd7-4c5a-813d-d814e2655a03-config-data\") pod \"glance-db-sync-rhspb\" (UID: \"e20e724b-0cd7-4c5a-813d-d814e2655a03\") " pod="glance-kuttl-tests/glance-db-sync-rhspb" Dec 03 12:48:06 crc kubenswrapper[4711]: I1203 12:48:06.500158 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e20e724b-0cd7-4c5a-813d-d814e2655a03-db-sync-config-data\") pod \"glance-db-sync-rhspb\" (UID: \"e20e724b-0cd7-4c5a-813d-d814e2655a03\") " pod="glance-kuttl-tests/glance-db-sync-rhspb" Dec 03 12:48:06 crc kubenswrapper[4711]: I1203 12:48:06.504476 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e20e724b-0cd7-4c5a-813d-d814e2655a03-db-sync-config-data\") pod \"glance-db-sync-rhspb\" (UID: \"e20e724b-0cd7-4c5a-813d-d814e2655a03\") " pod="glance-kuttl-tests/glance-db-sync-rhspb" Dec 03 12:48:06 crc kubenswrapper[4711]: I1203 12:48:06.504882 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e20e724b-0cd7-4c5a-813d-d814e2655a03-config-data\") pod 
\"glance-db-sync-rhspb\" (UID: \"e20e724b-0cd7-4c5a-813d-d814e2655a03\") " pod="glance-kuttl-tests/glance-db-sync-rhspb" Dec 03 12:48:06 crc kubenswrapper[4711]: I1203 12:48:06.518581 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wx9v\" (UniqueName: \"kubernetes.io/projected/e20e724b-0cd7-4c5a-813d-d814e2655a03-kube-api-access-8wx9v\") pod \"glance-db-sync-rhspb\" (UID: \"e20e724b-0cd7-4c5a-813d-d814e2655a03\") " pod="glance-kuttl-tests/glance-db-sync-rhspb" Dec 03 12:48:06 crc kubenswrapper[4711]: I1203 12:48:06.547612 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-rhspb" Dec 03 12:48:06 crc kubenswrapper[4711]: I1203 12:48:06.757133 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-rhspb"] Dec 03 12:48:07 crc kubenswrapper[4711]: I1203 12:48:07.314990 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-rhspb" event={"ID":"e20e724b-0cd7-4c5a-813d-d814e2655a03","Type":"ContainerStarted","Data":"bcf0a0545b10dccbd48345ce8b39c053d562c3978af8205f0307872a4ae3385a"} Dec 03 12:48:07 crc kubenswrapper[4711]: I1203 12:48:07.315307 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-rhspb" event={"ID":"e20e724b-0cd7-4c5a-813d-d814e2655a03","Type":"ContainerStarted","Data":"f71b3337d4e8181c790ae2604c7264a942a857bcf9a04cd7bdd2473af416c65c"} Dec 03 12:48:07 crc kubenswrapper[4711]: I1203 12:48:07.337942 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-rhspb" podStartSLOduration=1.337922342 podStartE2EDuration="1.337922342s" podCreationTimestamp="2025-12-03 12:48:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:48:07.330637923 +0000 UTC m=+2005.999889178" 
watchObservedRunningTime="2025-12-03 12:48:07.337922342 +0000 UTC m=+2006.007173597" Dec 03 12:48:10 crc kubenswrapper[4711]: I1203 12:48:10.343881 4711 generic.go:334] "Generic (PLEG): container finished" podID="e20e724b-0cd7-4c5a-813d-d814e2655a03" containerID="bcf0a0545b10dccbd48345ce8b39c053d562c3978af8205f0307872a4ae3385a" exitCode=0 Dec 03 12:48:10 crc kubenswrapper[4711]: I1203 12:48:10.343935 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-rhspb" event={"ID":"e20e724b-0cd7-4c5a-813d-d814e2655a03","Type":"ContainerDied","Data":"bcf0a0545b10dccbd48345ce8b39c053d562c3978af8205f0307872a4ae3385a"} Dec 03 12:48:11 crc kubenswrapper[4711]: I1203 12:48:11.658338 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-rhspb" Dec 03 12:48:11 crc kubenswrapper[4711]: I1203 12:48:11.776890 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wx9v\" (UniqueName: \"kubernetes.io/projected/e20e724b-0cd7-4c5a-813d-d814e2655a03-kube-api-access-8wx9v\") pod \"e20e724b-0cd7-4c5a-813d-d814e2655a03\" (UID: \"e20e724b-0cd7-4c5a-813d-d814e2655a03\") " Dec 03 12:48:11 crc kubenswrapper[4711]: I1203 12:48:11.777096 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e20e724b-0cd7-4c5a-813d-d814e2655a03-db-sync-config-data\") pod \"e20e724b-0cd7-4c5a-813d-d814e2655a03\" (UID: \"e20e724b-0cd7-4c5a-813d-d814e2655a03\") " Dec 03 12:48:11 crc kubenswrapper[4711]: I1203 12:48:11.777205 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e20e724b-0cd7-4c5a-813d-d814e2655a03-config-data\") pod \"e20e724b-0cd7-4c5a-813d-d814e2655a03\" (UID: \"e20e724b-0cd7-4c5a-813d-d814e2655a03\") " Dec 03 12:48:11 crc kubenswrapper[4711]: I1203 12:48:11.782596 4711 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e20e724b-0cd7-4c5a-813d-d814e2655a03-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e20e724b-0cd7-4c5a-813d-d814e2655a03" (UID: "e20e724b-0cd7-4c5a-813d-d814e2655a03"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:48:11 crc kubenswrapper[4711]: I1203 12:48:11.782941 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e20e724b-0cd7-4c5a-813d-d814e2655a03-kube-api-access-8wx9v" (OuterVolumeSpecName: "kube-api-access-8wx9v") pod "e20e724b-0cd7-4c5a-813d-d814e2655a03" (UID: "e20e724b-0cd7-4c5a-813d-d814e2655a03"). InnerVolumeSpecName "kube-api-access-8wx9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:48:11 crc kubenswrapper[4711]: I1203 12:48:11.831357 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e20e724b-0cd7-4c5a-813d-d814e2655a03-config-data" (OuterVolumeSpecName: "config-data") pod "e20e724b-0cd7-4c5a-813d-d814e2655a03" (UID: "e20e724b-0cd7-4c5a-813d-d814e2655a03"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:48:11 crc kubenswrapper[4711]: I1203 12:48:11.879585 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e20e724b-0cd7-4c5a-813d-d814e2655a03-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:11 crc kubenswrapper[4711]: I1203 12:48:11.879632 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wx9v\" (UniqueName: \"kubernetes.io/projected/e20e724b-0cd7-4c5a-813d-d814e2655a03-kube-api-access-8wx9v\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:11 crc kubenswrapper[4711]: I1203 12:48:11.879650 4711 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e20e724b-0cd7-4c5a-813d-d814e2655a03-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:12 crc kubenswrapper[4711]: I1203 12:48:12.358889 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-rhspb" event={"ID":"e20e724b-0cd7-4c5a-813d-d814e2655a03","Type":"ContainerDied","Data":"f71b3337d4e8181c790ae2604c7264a942a857bcf9a04cd7bdd2473af416c65c"} Dec 03 12:48:12 crc kubenswrapper[4711]: I1203 12:48:12.358950 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f71b3337d4e8181c790ae2604c7264a942a857bcf9a04cd7bdd2473af416c65c" Dec 03 12:48:12 crc kubenswrapper[4711]: I1203 12:48:12.358960 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-rhspb" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.503328 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 03 12:48:13 crc kubenswrapper[4711]: E1203 12:48:13.504135 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e20e724b-0cd7-4c5a-813d-d814e2655a03" containerName="glance-db-sync" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.504147 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="e20e724b-0cd7-4c5a-813d-d814e2655a03" containerName="glance-db-sync" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.504274 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="e20e724b-0cd7-4c5a-813d-d814e2655a03" containerName="glance-db-sync" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.505398 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.507406 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-nmhdr" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.510641 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-external-config-data" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.511084 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.529346 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.604714 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") 
pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.604774 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/030389c0-f4a6-4f89-9189-b9c685d44387-config-data\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.604805 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.604834 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.604876 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.604930 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/030389c0-f4a6-4f89-9189-b9c685d44387-logs\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.604957 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.604983 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxwcj\" (UniqueName: \"kubernetes.io/projected/030389c0-f4a6-4f89-9189-b9c685d44387-kube-api-access-mxwcj\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.605007 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/030389c0-f4a6-4f89-9189-b9c685d44387-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.605030 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/030389c0-f4a6-4f89-9189-b9c685d44387-scripts\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.605077 4711 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-dev\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.605122 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-run\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.605153 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.605218 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-sys\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.706223 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/030389c0-f4a6-4f89-9189-b9c685d44387-logs\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.706270 4711 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.706297 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxwcj\" (UniqueName: \"kubernetes.io/projected/030389c0-f4a6-4f89-9189-b9c685d44387-kube-api-access-mxwcj\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.706320 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/030389c0-f4a6-4f89-9189-b9c685d44387-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.706339 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/030389c0-f4a6-4f89-9189-b9c685d44387-scripts\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.706361 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-dev\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.706403 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"run\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-run\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.706432 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.706474 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-sys\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.706506 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.706534 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/030389c0-f4a6-4f89-9189-b9c685d44387-config-data\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.706554 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.706574 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.706590 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.706657 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.706720 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/030389c0-f4a6-4f89-9189-b9c685d44387-logs\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.706761 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-etc-nvme\") pod 
\"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.706773 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-sys\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.706800 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-run\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.706805 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-dev\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.706822 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.707077 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/030389c0-f4a6-4f89-9189-b9c685d44387-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.707351 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.707418 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") device mount path \"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.707531 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") device mount path \"/mnt/openstack/pv07\"" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.713693 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/030389c0-f4a6-4f89-9189-b9c685d44387-scripts\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.713986 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/030389c0-f4a6-4f89-9189-b9c685d44387-config-data\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.725955 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxwcj\" (UniqueName: \"kubernetes.io/projected/030389c0-f4a6-4f89-9189-b9c685d44387-kube-api-access-mxwcj\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.731377 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.734330 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:13 crc kubenswrapper[4711]: I1203 12:48:13.822106 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.106847 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.108427 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.111177 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.130786 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.216839 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-dev\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.216893 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.216939 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.216960 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.216983 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9snnp\" (UniqueName: \"kubernetes.io/projected/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-kube-api-access-9snnp\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.217050 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-logs\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.217076 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-run\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.217092 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.217109 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.217211 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.217231 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.217247 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.217287 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.217373 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-sys\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.281787 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.319443 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-run\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.319510 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.319537 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.319594 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 
12:48:14.319623 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.319647 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.319684 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.319706 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-sys\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.319748 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-dev\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.319786 4711 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.319815 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.319842 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.319864 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9snnp\" (UniqueName: \"kubernetes.io/projected/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-kube-api-access-9snnp\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.319938 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-logs\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.320565 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-logs\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.320633 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-run\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.321493 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-sys\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.321563 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.321598 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.321642 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-var-locks-brick\") pod 
\"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.321692 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-dev\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.321813 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") device mount path \"/mnt/openstack/pv12\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.321955 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.322003 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.322403 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-etc-iscsi\") pod \"glance-default-internal-api-0\" 
(UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.328127 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.329453 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.339337 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9snnp\" (UniqueName: \"kubernetes.io/projected/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-kube-api-access-9snnp\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.343175 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.351497 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.374538 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"030389c0-f4a6-4f89-9189-b9c685d44387","Type":"ContainerStarted","Data":"2429e2effbb987786d0a8ffd276d64151ed760b4b3b254bf9f46c0ee09dea45b"} Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.439980 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:14 crc kubenswrapper[4711]: I1203 12:48:14.481288 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:48:15 crc kubenswrapper[4711]: I1203 12:48:15.156885 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:48:15 crc kubenswrapper[4711]: I1203 12:48:15.383245 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"030389c0-f4a6-4f89-9189-b9c685d44387","Type":"ContainerStarted","Data":"a0b90edd47f253dc9e5095c9ff3a57b7ddc4853f7a5bca2d1a83376302f679e7"} Dec 03 12:48:15 crc kubenswrapper[4711]: I1203 12:48:15.383288 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"030389c0-f4a6-4f89-9189-b9c685d44387","Type":"ContainerStarted","Data":"c875dd1d2275d6ac7a4dc95ae770b5023a4e9789f00f9b1409a0b0a85472a9ac"} Dec 03 12:48:15 crc kubenswrapper[4711]: I1203 12:48:15.383300 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"030389c0-f4a6-4f89-9189-b9c685d44387","Type":"ContainerStarted","Data":"e303762a6ba2d47b4bff2cd3d0799ec6077278e982fdb7a37fba873320770477"} Dec 03 12:48:15 crc kubenswrapper[4711]: I1203 12:48:15.386280 4711 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc","Type":"ContainerStarted","Data":"d27098f21df3a3358ba870fe5fdfa4b0269e49c808c592c068f62629e72c3e9e"} Dec 03 12:48:15 crc kubenswrapper[4711]: I1203 12:48:15.386392 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc","Type":"ContainerStarted","Data":"be4a6b4c0ff1d37d04c36f53e44e88a822a2c0daf3448169c508ea1080c22df0"} Dec 03 12:48:16 crc kubenswrapper[4711]: I1203 12:48:16.396268 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc","Type":"ContainerStarted","Data":"6ccf5ca9cba3f0a415d384a7a9e6b3f3cb9f87122f588899f5695b360d0d74d4"} Dec 03 12:48:16 crc kubenswrapper[4711]: I1203 12:48:16.396778 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc","Type":"ContainerStarted","Data":"db7e01aaa59e12ffe55aa39689a03a28b7abaecab8ec72a7804ece1b4be8e714"} Dec 03 12:48:16 crc kubenswrapper[4711]: I1203 12:48:16.397354 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="ae99403a-512a-4d8c-9ae8-1c75c37f4ccc" containerName="glance-api" containerID="cri-o://6ccf5ca9cba3f0a415d384a7a9e6b3f3cb9f87122f588899f5695b360d0d74d4" gracePeriod=30 Dec 03 12:48:16 crc kubenswrapper[4711]: I1203 12:48:16.397310 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="ae99403a-512a-4d8c-9ae8-1c75c37f4ccc" containerName="glance-log" containerID="cri-o://d27098f21df3a3358ba870fe5fdfa4b0269e49c808c592c068f62629e72c3e9e" gracePeriod=30 Dec 03 12:48:16 crc kubenswrapper[4711]: I1203 12:48:16.397469 
4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="ae99403a-512a-4d8c-9ae8-1c75c37f4ccc" containerName="glance-httpd" containerID="cri-o://db7e01aaa59e12ffe55aa39689a03a28b7abaecab8ec72a7804ece1b4be8e714" gracePeriod=30 Dec 03 12:48:16 crc kubenswrapper[4711]: I1203 12:48:16.444346 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.444327611 podStartE2EDuration="3.444327611s" podCreationTimestamp="2025-12-03 12:48:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:48:16.441392201 +0000 UTC m=+2015.110643546" watchObservedRunningTime="2025-12-03 12:48:16.444327611 +0000 UTC m=+2015.113578886" Dec 03 12:48:16 crc kubenswrapper[4711]: I1203 12:48:16.449661 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.449642787 podStartE2EDuration="3.449642787s" podCreationTimestamp="2025-12-03 12:48:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:48:15.421654956 +0000 UTC m=+2014.090906231" watchObservedRunningTime="2025-12-03 12:48:16.449642787 +0000 UTC m=+2015.118894062" Dec 03 12:48:16 crc kubenswrapper[4711]: I1203 12:48:16.882183 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:16 crc kubenswrapper[4711]: I1203 12:48:16.999176 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-logs\") pod \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " Dec 03 12:48:16 crc kubenswrapper[4711]: I1203 12:48:16.999211 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-dev\") pod \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " Dec 03 12:48:16 crc kubenswrapper[4711]: I1203 12:48:16.999282 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-config-data\") pod \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " Dec 03 12:48:16 crc kubenswrapper[4711]: I1203 12:48:16.999307 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-lib-modules\") pod \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " Dec 03 12:48:16 crc kubenswrapper[4711]: I1203 12:48:16.999329 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-httpd-run\") pod \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " Dec 03 12:48:16 crc kubenswrapper[4711]: I1203 12:48:16.999355 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " Dec 03 12:48:16 crc kubenswrapper[4711]: I1203 12:48:16.999382 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-etc-iscsi\") pod \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " Dec 03 12:48:16 crc kubenswrapper[4711]: I1203 12:48:16.999431 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-var-locks-brick\") pod \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " Dec 03 12:48:16 crc kubenswrapper[4711]: I1203 12:48:16.999466 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-etc-nvme\") pod \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " Dec 03 12:48:16 crc kubenswrapper[4711]: I1203 12:48:16.999494 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9snnp\" (UniqueName: \"kubernetes.io/projected/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-kube-api-access-9snnp\") pod \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " Dec 03 12:48:16 crc kubenswrapper[4711]: I1203 12:48:16.999510 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-sys\") pod \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " Dec 03 12:48:16 crc kubenswrapper[4711]: I1203 12:48:16.999535 4711 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-run\") pod \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " Dec 03 12:48:16 crc kubenswrapper[4711]: I1203 12:48:16.999556 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " Dec 03 12:48:16 crc kubenswrapper[4711]: I1203 12:48:16.999580 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-scripts\") pod \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\" (UID: \"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc\") " Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.000724 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "ae99403a-512a-4d8c-9ae8-1c75c37f4ccc" (UID: "ae99403a-512a-4d8c-9ae8-1c75c37f4ccc"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.000767 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "ae99403a-512a-4d8c-9ae8-1c75c37f4ccc" (UID: "ae99403a-512a-4d8c-9ae8-1c75c37f4ccc"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.000967 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-dev" (OuterVolumeSpecName: "dev") pod "ae99403a-512a-4d8c-9ae8-1c75c37f4ccc" (UID: "ae99403a-512a-4d8c-9ae8-1c75c37f4ccc"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.001016 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "ae99403a-512a-4d8c-9ae8-1c75c37f4ccc" (UID: "ae99403a-512a-4d8c-9ae8-1c75c37f4ccc"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.001059 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "ae99403a-512a-4d8c-9ae8-1c75c37f4ccc" (UID: "ae99403a-512a-4d8c-9ae8-1c75c37f4ccc"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.001062 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ae99403a-512a-4d8c-9ae8-1c75c37f4ccc" (UID: "ae99403a-512a-4d8c-9ae8-1c75c37f4ccc"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.001183 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-logs" (OuterVolumeSpecName: "logs") pod "ae99403a-512a-4d8c-9ae8-1c75c37f4ccc" (UID: "ae99403a-512a-4d8c-9ae8-1c75c37f4ccc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.001365 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-sys" (OuterVolumeSpecName: "sys") pod "ae99403a-512a-4d8c-9ae8-1c75c37f4ccc" (UID: "ae99403a-512a-4d8c-9ae8-1c75c37f4ccc"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.001420 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-run" (OuterVolumeSpecName: "run") pod "ae99403a-512a-4d8c-9ae8-1c75c37f4ccc" (UID: "ae99403a-512a-4d8c-9ae8-1c75c37f4ccc"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.006148 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance-cache") pod "ae99403a-512a-4d8c-9ae8-1c75c37f4ccc" (UID: "ae99403a-512a-4d8c-9ae8-1c75c37f4ccc"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.006212 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-kube-api-access-9snnp" (OuterVolumeSpecName: "kube-api-access-9snnp") pod "ae99403a-512a-4d8c-9ae8-1c75c37f4ccc" (UID: "ae99403a-512a-4d8c-9ae8-1c75c37f4ccc"). InnerVolumeSpecName "kube-api-access-9snnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.006214 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "ae99403a-512a-4d8c-9ae8-1c75c37f4ccc" (UID: "ae99403a-512a-4d8c-9ae8-1c75c37f4ccc"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.008107 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-scripts" (OuterVolumeSpecName: "scripts") pod "ae99403a-512a-4d8c-9ae8-1c75c37f4ccc" (UID: "ae99403a-512a-4d8c-9ae8-1c75c37f4ccc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.074071 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-config-data" (OuterVolumeSpecName: "config-data") pod "ae99403a-512a-4d8c-9ae8-1c75c37f4ccc" (UID: "ae99403a-512a-4d8c-9ae8-1c75c37f4ccc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.101865 4711 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.101936 4711 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.101958 4711 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.101977 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9snnp\" (UniqueName: \"kubernetes.io/projected/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-kube-api-access-9snnp\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.101997 4711 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-sys\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.102014 4711 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.102079 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.102098 4711 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.104011 4711 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-logs\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.104061 4711 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-dev\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.104076 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.104090 4711 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.104103 4711 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.104157 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.118304 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.126615 4711 operation_generator.go:917] UnmountDevice succeeded for 
volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.205171 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.205209 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.404553 4711 generic.go:334] "Generic (PLEG): container finished" podID="ae99403a-512a-4d8c-9ae8-1c75c37f4ccc" containerID="6ccf5ca9cba3f0a415d384a7a9e6b3f3cb9f87122f588899f5695b360d0d74d4" exitCode=143 Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.404591 4711 generic.go:334] "Generic (PLEG): container finished" podID="ae99403a-512a-4d8c-9ae8-1c75c37f4ccc" containerID="db7e01aaa59e12ffe55aa39689a03a28b7abaecab8ec72a7804ece1b4be8e714" exitCode=0 Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.404604 4711 generic.go:334] "Generic (PLEG): container finished" podID="ae99403a-512a-4d8c-9ae8-1c75c37f4ccc" containerID="d27098f21df3a3358ba870fe5fdfa4b0269e49c808c592c068f62629e72c3e9e" exitCode=143 Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.404628 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc","Type":"ContainerDied","Data":"6ccf5ca9cba3f0a415d384a7a9e6b3f3cb9f87122f588899f5695b360d0d74d4"} Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.404673 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc","Type":"ContainerDied","Data":"db7e01aaa59e12ffe55aa39689a03a28b7abaecab8ec72a7804ece1b4be8e714"} Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.404687 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc","Type":"ContainerDied","Data":"d27098f21df3a3358ba870fe5fdfa4b0269e49c808c592c068f62629e72c3e9e"} Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.404698 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"ae99403a-512a-4d8c-9ae8-1c75c37f4ccc","Type":"ContainerDied","Data":"be4a6b4c0ff1d37d04c36f53e44e88a822a2c0daf3448169c508ea1080c22df0"} Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.404729 4711 scope.go:117] "RemoveContainer" containerID="6ccf5ca9cba3f0a415d384a7a9e6b3f3cb9f87122f588899f5695b360d0d74d4" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.404887 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.430847 4711 scope.go:117] "RemoveContainer" containerID="db7e01aaa59e12ffe55aa39689a03a28b7abaecab8ec72a7804ece1b4be8e714" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.436864 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.443681 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.460110 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:48:17 crc kubenswrapper[4711]: E1203 12:48:17.460405 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae99403a-512a-4d8c-9ae8-1c75c37f4ccc" containerName="glance-api" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.460422 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae99403a-512a-4d8c-9ae8-1c75c37f4ccc" containerName="glance-api" Dec 03 12:48:17 crc kubenswrapper[4711]: E1203 12:48:17.460442 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae99403a-512a-4d8c-9ae8-1c75c37f4ccc" containerName="glance-httpd" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.460448 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae99403a-512a-4d8c-9ae8-1c75c37f4ccc" containerName="glance-httpd" Dec 03 12:48:17 crc kubenswrapper[4711]: E1203 12:48:17.460456 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae99403a-512a-4d8c-9ae8-1c75c37f4ccc" containerName="glance-log" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.460462 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae99403a-512a-4d8c-9ae8-1c75c37f4ccc" containerName="glance-log" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 
12:48:17.460584 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae99403a-512a-4d8c-9ae8-1c75c37f4ccc" containerName="glance-log" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.460597 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae99403a-512a-4d8c-9ae8-1c75c37f4ccc" containerName="glance-api" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.460606 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae99403a-512a-4d8c-9ae8-1c75c37f4ccc" containerName="glance-httpd" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.462410 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.466921 4711 scope.go:117] "RemoveContainer" containerID="d27098f21df3a3358ba870fe5fdfa4b0269e49c808c592c068f62629e72c3e9e" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.467626 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.496240 4711 scope.go:117] "RemoveContainer" containerID="6ccf5ca9cba3f0a415d384a7a9e6b3f3cb9f87122f588899f5695b360d0d74d4" Dec 03 12:48:17 crc kubenswrapper[4711]: E1203 12:48:17.496888 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ccf5ca9cba3f0a415d384a7a9e6b3f3cb9f87122f588899f5695b360d0d74d4\": container with ID starting with 6ccf5ca9cba3f0a415d384a7a9e6b3f3cb9f87122f588899f5695b360d0d74d4 not found: ID does not exist" containerID="6ccf5ca9cba3f0a415d384a7a9e6b3f3cb9f87122f588899f5695b360d0d74d4" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.496954 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ccf5ca9cba3f0a415d384a7a9e6b3f3cb9f87122f588899f5695b360d0d74d4"} err="failed 
to get container status \"6ccf5ca9cba3f0a415d384a7a9e6b3f3cb9f87122f588899f5695b360d0d74d4\": rpc error: code = NotFound desc = could not find container \"6ccf5ca9cba3f0a415d384a7a9e6b3f3cb9f87122f588899f5695b360d0d74d4\": container with ID starting with 6ccf5ca9cba3f0a415d384a7a9e6b3f3cb9f87122f588899f5695b360d0d74d4 not found: ID does not exist" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.496995 4711 scope.go:117] "RemoveContainer" containerID="db7e01aaa59e12ffe55aa39689a03a28b7abaecab8ec72a7804ece1b4be8e714" Dec 03 12:48:17 crc kubenswrapper[4711]: E1203 12:48:17.497999 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db7e01aaa59e12ffe55aa39689a03a28b7abaecab8ec72a7804ece1b4be8e714\": container with ID starting with db7e01aaa59e12ffe55aa39689a03a28b7abaecab8ec72a7804ece1b4be8e714 not found: ID does not exist" containerID="db7e01aaa59e12ffe55aa39689a03a28b7abaecab8ec72a7804ece1b4be8e714" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.498052 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db7e01aaa59e12ffe55aa39689a03a28b7abaecab8ec72a7804ece1b4be8e714"} err="failed to get container status \"db7e01aaa59e12ffe55aa39689a03a28b7abaecab8ec72a7804ece1b4be8e714\": rpc error: code = NotFound desc = could not find container \"db7e01aaa59e12ffe55aa39689a03a28b7abaecab8ec72a7804ece1b4be8e714\": container with ID starting with db7e01aaa59e12ffe55aa39689a03a28b7abaecab8ec72a7804ece1b4be8e714 not found: ID does not exist" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.498088 4711 scope.go:117] "RemoveContainer" containerID="d27098f21df3a3358ba870fe5fdfa4b0269e49c808c592c068f62629e72c3e9e" Dec 03 12:48:17 crc kubenswrapper[4711]: E1203 12:48:17.502826 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d27098f21df3a3358ba870fe5fdfa4b0269e49c808c592c068f62629e72c3e9e\": container with ID starting with d27098f21df3a3358ba870fe5fdfa4b0269e49c808c592c068f62629e72c3e9e not found: ID does not exist" containerID="d27098f21df3a3358ba870fe5fdfa4b0269e49c808c592c068f62629e72c3e9e" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.502889 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d27098f21df3a3358ba870fe5fdfa4b0269e49c808c592c068f62629e72c3e9e"} err="failed to get container status \"d27098f21df3a3358ba870fe5fdfa4b0269e49c808c592c068f62629e72c3e9e\": rpc error: code = NotFound desc = could not find container \"d27098f21df3a3358ba870fe5fdfa4b0269e49c808c592c068f62629e72c3e9e\": container with ID starting with d27098f21df3a3358ba870fe5fdfa4b0269e49c808c592c068f62629e72c3e9e not found: ID does not exist" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.502953 4711 scope.go:117] "RemoveContainer" containerID="6ccf5ca9cba3f0a415d384a7a9e6b3f3cb9f87122f588899f5695b360d0d74d4" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.505328 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ccf5ca9cba3f0a415d384a7a9e6b3f3cb9f87122f588899f5695b360d0d74d4"} err="failed to get container status \"6ccf5ca9cba3f0a415d384a7a9e6b3f3cb9f87122f588899f5695b360d0d74d4\": rpc error: code = NotFound desc = could not find container \"6ccf5ca9cba3f0a415d384a7a9e6b3f3cb9f87122f588899f5695b360d0d74d4\": container with ID starting with 6ccf5ca9cba3f0a415d384a7a9e6b3f3cb9f87122f588899f5695b360d0d74d4 not found: ID does not exist" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.505373 4711 scope.go:117] "RemoveContainer" containerID="db7e01aaa59e12ffe55aa39689a03a28b7abaecab8ec72a7804ece1b4be8e714" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.505743 4711 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"db7e01aaa59e12ffe55aa39689a03a28b7abaecab8ec72a7804ece1b4be8e714"} err="failed to get container status \"db7e01aaa59e12ffe55aa39689a03a28b7abaecab8ec72a7804ece1b4be8e714\": rpc error: code = NotFound desc = could not find container \"db7e01aaa59e12ffe55aa39689a03a28b7abaecab8ec72a7804ece1b4be8e714\": container with ID starting with db7e01aaa59e12ffe55aa39689a03a28b7abaecab8ec72a7804ece1b4be8e714 not found: ID does not exist" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.505770 4711 scope.go:117] "RemoveContainer" containerID="d27098f21df3a3358ba870fe5fdfa4b0269e49c808c592c068f62629e72c3e9e" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.510178 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d27098f21df3a3358ba870fe5fdfa4b0269e49c808c592c068f62629e72c3e9e"} err="failed to get container status \"d27098f21df3a3358ba870fe5fdfa4b0269e49c808c592c068f62629e72c3e9e\": rpc error: code = NotFound desc = could not find container \"d27098f21df3a3358ba870fe5fdfa4b0269e49c808c592c068f62629e72c3e9e\": container with ID starting with d27098f21df3a3358ba870fe5fdfa4b0269e49c808c592c068f62629e72c3e9e not found: ID does not exist" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.510225 4711 scope.go:117] "RemoveContainer" containerID="6ccf5ca9cba3f0a415d384a7a9e6b3f3cb9f87122f588899f5695b360d0d74d4" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.516118 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ccf5ca9cba3f0a415d384a7a9e6b3f3cb9f87122f588899f5695b360d0d74d4"} err="failed to get container status \"6ccf5ca9cba3f0a415d384a7a9e6b3f3cb9f87122f588899f5695b360d0d74d4\": rpc error: code = NotFound desc = could not find container \"6ccf5ca9cba3f0a415d384a7a9e6b3f3cb9f87122f588899f5695b360d0d74d4\": container with ID starting with 6ccf5ca9cba3f0a415d384a7a9e6b3f3cb9f87122f588899f5695b360d0d74d4 not found: ID does not 
exist" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.516176 4711 scope.go:117] "RemoveContainer" containerID="db7e01aaa59e12ffe55aa39689a03a28b7abaecab8ec72a7804ece1b4be8e714" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.520759 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.520930 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db7e01aaa59e12ffe55aa39689a03a28b7abaecab8ec72a7804ece1b4be8e714"} err="failed to get container status \"db7e01aaa59e12ffe55aa39689a03a28b7abaecab8ec72a7804ece1b4be8e714\": rpc error: code = NotFound desc = could not find container \"db7e01aaa59e12ffe55aa39689a03a28b7abaecab8ec72a7804ece1b4be8e714\": container with ID starting with db7e01aaa59e12ffe55aa39689a03a28b7abaecab8ec72a7804ece1b4be8e714 not found: ID does not exist" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.520964 4711 scope.go:117] "RemoveContainer" containerID="d27098f21df3a3358ba870fe5fdfa4b0269e49c808c592c068f62629e72c3e9e" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.521649 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d27098f21df3a3358ba870fe5fdfa4b0269e49c808c592c068f62629e72c3e9e"} err="failed to get container status \"d27098f21df3a3358ba870fe5fdfa4b0269e49c808c592c068f62629e72c3e9e\": rpc error: code = NotFound desc = could not find container \"d27098f21df3a3358ba870fe5fdfa4b0269e49c808c592c068f62629e72c3e9e\": container with ID starting with d27098f21df3a3358ba870fe5fdfa4b0269e49c808c592c068f62629e72c3e9e not found: ID does not exist" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.623995 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-dev\") pod 
\"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.624049 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f815083-1f43-4c8d-9285-94b2aa696a09-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.624074 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.624112 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f815083-1f43-4c8d-9285-94b2aa696a09-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.624145 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.624163 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/7f815083-1f43-4c8d-9285-94b2aa696a09-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.624176 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.624226 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.624253 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.624269 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.624308 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sys\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-sys\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.624332 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8pt6\" (UniqueName: \"kubernetes.io/projected/7f815083-1f43-4c8d-9285-94b2aa696a09-kube-api-access-l8pt6\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.624350 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f815083-1f43-4c8d-9285-94b2aa696a09-logs\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.624371 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-run\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.726554 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-run\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.726982 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" 
(UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-dev\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.726699 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-run\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.727019 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f815083-1f43-4c8d-9285-94b2aa696a09-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.727089 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.727145 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f815083-1f43-4c8d-9285-94b2aa696a09-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.727208 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.727234 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7f815083-1f43-4c8d-9285-94b2aa696a09-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.727253 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.727348 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.727385 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.727412 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.727477 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-sys\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.727500 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8pt6\" (UniqueName: \"kubernetes.io/projected/7f815083-1f43-4c8d-9285-94b2aa696a09-kube-api-access-l8pt6\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.727527 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f815083-1f43-4c8d-9285-94b2aa696a09-logs\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.727758 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.727755 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-dev\") pod 
\"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.727890 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.727947 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f815083-1f43-4c8d-9285-94b2aa696a09-logs\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.728180 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.728337 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7f815083-1f43-4c8d-9285-94b2aa696a09-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.728472 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"7f815083-1f43-4c8d-9285-94b2aa696a09\") device mount path \"/mnt/openstack/pv12\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.729344 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.729383 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-sys\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.729501 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.733778 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f815083-1f43-4c8d-9285-94b2aa696a09-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.734936 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f815083-1f43-4c8d-9285-94b2aa696a09-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.762738 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8pt6\" (UniqueName: \"kubernetes.io/projected/7f815083-1f43-4c8d-9285-94b2aa696a09-kube-api-access-l8pt6\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.763103 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.765554 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.777018 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:17 crc kubenswrapper[4711]: I1203 12:48:17.827247 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae99403a-512a-4d8c-9ae8-1c75c37f4ccc" path="/var/lib/kubelet/pods/ae99403a-512a-4d8c-9ae8-1c75c37f4ccc/volumes" Dec 03 12:48:18 crc kubenswrapper[4711]: I1203 12:48:18.220410 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:48:18 crc kubenswrapper[4711]: I1203 12:48:18.418429 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"7f815083-1f43-4c8d-9285-94b2aa696a09","Type":"ContainerStarted","Data":"878741c5d39e3bc5b93a2c8c61b058a09d349ccc01fa40f911726e5152857b52"} Dec 03 12:48:18 crc kubenswrapper[4711]: I1203 12:48:18.418477 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"7f815083-1f43-4c8d-9285-94b2aa696a09","Type":"ContainerStarted","Data":"ae33bd94df8af2e2d8b570e425544c1be2eb9822b55fd5862ab58a833dc90587"} Dec 03 12:48:19 crc kubenswrapper[4711]: I1203 12:48:19.433592 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"7f815083-1f43-4c8d-9285-94b2aa696a09","Type":"ContainerStarted","Data":"82cb7fa23447af324f3004c01d40611811dd8a1f263b189a0f8264976c458226"} Dec 03 12:48:19 crc kubenswrapper[4711]: I1203 12:48:19.434300 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"7f815083-1f43-4c8d-9285-94b2aa696a09","Type":"ContainerStarted","Data":"008c83815d5b196e1144f29523ceb5281c6c112210bb5b3410c529389d6f1bba"} Dec 03 12:48:19 crc kubenswrapper[4711]: I1203 12:48:19.465535 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" 
podStartSLOduration=2.465515381 podStartE2EDuration="2.465515381s" podCreationTimestamp="2025-12-03 12:48:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:48:19.460737961 +0000 UTC m=+2018.129989256" watchObservedRunningTime="2025-12-03 12:48:19.465515381 +0000 UTC m=+2018.134766636" Dec 03 12:48:23 crc kubenswrapper[4711]: I1203 12:48:23.828997 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:23 crc kubenswrapper[4711]: I1203 12:48:23.829461 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:23 crc kubenswrapper[4711]: I1203 12:48:23.829472 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:23 crc kubenswrapper[4711]: I1203 12:48:23.849718 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:23 crc kubenswrapper[4711]: I1203 12:48:23.851537 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:23 crc kubenswrapper[4711]: I1203 12:48:23.864512 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:24 crc kubenswrapper[4711]: I1203 12:48:24.472658 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:24 crc kubenswrapper[4711]: I1203 12:48:24.472723 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:24 crc kubenswrapper[4711]: I1203 12:48:24.472737 4711 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:24 crc kubenswrapper[4711]: I1203 12:48:24.484481 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:24 crc kubenswrapper[4711]: I1203 12:48:24.485987 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:24 crc kubenswrapper[4711]: I1203 12:48:24.486150 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:27 crc kubenswrapper[4711]: I1203 12:48:27.777963 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:27 crc kubenswrapper[4711]: I1203 12:48:27.778363 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:27 crc kubenswrapper[4711]: I1203 12:48:27.778383 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:27 crc kubenswrapper[4711]: I1203 12:48:27.802942 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:27 crc kubenswrapper[4711]: I1203 12:48:27.812868 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:27 crc kubenswrapper[4711]: I1203 12:48:27.831345 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:28 crc kubenswrapper[4711]: I1203 12:48:28.502580 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:28 crc kubenswrapper[4711]: I1203 12:48:28.502628 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:28 crc kubenswrapper[4711]: I1203 12:48:28.502641 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:28 crc kubenswrapper[4711]: I1203 12:48:28.523169 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:28 crc kubenswrapper[4711]: I1203 12:48:28.527501 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:28 crc kubenswrapper[4711]: I1203 12:48:28.527720 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.004942 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.006784 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.014603 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.016274 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.030987 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.045020 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.147991 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.149404 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.153582 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.155031 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.160849 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.172096 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.172145 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.172326 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.172431 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.172468 4711 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.172499 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.172534 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02e283b6-2f8f-47e8-b068-7fa6be429de7-scripts\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.172559 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-sys\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.172589 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8fmx\" (UniqueName: \"kubernetes.io/projected/02e283b6-2f8f-47e8-b068-7fa6be429de7-kube-api-access-l8fmx\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.172642 4711 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba2b9a61-4f13-4911-9977-157cbba6e185-logs\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.172686 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.172719 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba2b9a61-4f13-4911-9977-157cbba6e185-scripts\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.172742 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-run\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.172762 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-dev\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 
12:48:31.172781 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.172805 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-sys\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.172871 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02e283b6-2f8f-47e8-b068-7fa6be429de7-config-data\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.172967 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.172994 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-run\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc 
kubenswrapper[4711]: I1203 12:48:31.173016 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kzvd\" (UniqueName: \"kubernetes.io/projected/ba2b9a61-4f13-4911-9977-157cbba6e185-kube-api-access-7kzvd\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.173035 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.173105 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.173137 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02e283b6-2f8f-47e8-b068-7fa6be429de7-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.173161 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " 
pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.173190 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba2b9a61-4f13-4911-9977-157cbba6e185-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.173217 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02e283b6-2f8f-47e8-b068-7fa6be429de7-logs\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.173265 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba2b9a61-4f13-4911-9977-157cbba6e185-config-data\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.173316 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-dev\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.178067 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.274619 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6fa5593-a616-432d-a4cc-1b787ffc516a-config-data\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.274675 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7f00329-7067-49ec-84fa-cf2ab279955a-logs\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.274699 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7f00329-7067-49ec-84fa-cf2ab279955a-config-data\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.274744 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.274765 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.274821 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.274837 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-dev\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.274855 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8c6v\" (UniqueName: \"kubernetes.io/projected/c7f00329-7067-49ec-84fa-cf2ab279955a-kube-api-access-w8c6v\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.274874 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.274889 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-sys\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.274919 4711 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-run\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.274936 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6fa5593-a616-432d-a4cc-1b787ffc516a-logs\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.274951 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7f00329-7067-49ec-84fa-cf2ab279955a-scripts\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.274967 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7f00329-7067-49ec-84fa-cf2ab279955a-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.274971 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.274982 4711 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275014 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275088 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275174 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275196 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275216 4711 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275232 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275249 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275280 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275307 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02e283b6-2f8f-47e8-b068-7fa6be429de7-scripts\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275330 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sys\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-sys\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275352 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8fmx\" (UniqueName: \"kubernetes.io/projected/02e283b6-2f8f-47e8-b068-7fa6be429de7-kube-api-access-l8fmx\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275373 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba2b9a61-4f13-4911-9977-157cbba6e185-logs\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275432 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275475 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-sys\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275493 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275517 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba2b9a61-4f13-4911-9977-157cbba6e185-scripts\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275532 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-run\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275557 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-dev\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275561 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275573 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275595 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-sys\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275620 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275639 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275658 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275684 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/02e283b6-2f8f-47e8-b068-7fa6be429de7-config-data\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275703 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275719 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-run\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275757 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275773 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-run\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275793 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275815 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kzvd\" (UniqueName: \"kubernetes.io/projected/ba2b9a61-4f13-4911-9977-157cbba6e185-kube-api-access-7kzvd\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275862 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275936 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275965 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-sys\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.275980 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c6fa5593-a616-432d-a4cc-1b787ffc516a-scripts\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.276226 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.276224 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") device mount path \"/mnt/openstack/pv10\"" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.276403 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-run\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.276446 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-dev\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.276494 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") device mount path \"/mnt/openstack/pv18\"" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.276545 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-run\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.276589 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-dev\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.276558 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-sys\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.276586 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.276592 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-lib-modules\") pod 
\"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.276568 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.276676 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") device mount path \"/mnt/openstack/pv16\"" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.276681 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.277201 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba2b9a61-4f13-4911-9977-157cbba6e185-logs\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.277210 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c6fa5593-a616-432d-a4cc-1b787ffc516a-httpd-run\") pod \"glance-default-internal-api-1\" 
(UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.278467 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02e283b6-2f8f-47e8-b068-7fa6be429de7-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.278513 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.278560 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba2b9a61-4f13-4911-9977-157cbba6e185-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.278598 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02e283b6-2f8f-47e8-b068-7fa6be429de7-logs\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.278627 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba2b9a61-4f13-4911-9977-157cbba6e185-config-data\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " 
pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.278656 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-dev\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.278710 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8csqj\" (UniqueName: \"kubernetes.io/projected/c6fa5593-a616-432d-a4cc-1b787ffc516a-kube-api-access-8csqj\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.278931 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02e283b6-2f8f-47e8-b068-7fa6be429de7-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.278986 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.279195 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-dev\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" 
Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.280209 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02e283b6-2f8f-47e8-b068-7fa6be429de7-logs\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.281765 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba2b9a61-4f13-4911-9977-157cbba6e185-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.282457 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02e283b6-2f8f-47e8-b068-7fa6be429de7-scripts\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.282866 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02e283b6-2f8f-47e8-b068-7fa6be429de7-config-data\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.284500 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba2b9a61-4f13-4911-9977-157cbba6e185-config-data\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.288527 4711 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba2b9a61-4f13-4911-9977-157cbba6e185-scripts\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.292482 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8fmx\" (UniqueName: \"kubernetes.io/projected/02e283b6-2f8f-47e8-b068-7fa6be429de7-kube-api-access-l8fmx\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.295051 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kzvd\" (UniqueName: \"kubernetes.io/projected/ba2b9a61-4f13-4911-9977-157cbba6e185-kube-api-access-7kzvd\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.299075 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.301108 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.303621 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-1\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.322650 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-external-api-2\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.341830 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.357128 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.380286 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8csqj\" (UniqueName: \"kubernetes.io/projected/c6fa5593-a616-432d-a4cc-1b787ffc516a-kube-api-access-8csqj\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.380366 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6fa5593-a616-432d-a4cc-1b787ffc516a-config-data\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.380391 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7f00329-7067-49ec-84fa-cf2ab279955a-logs\") pod 
\"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.380414 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7f00329-7067-49ec-84fa-cf2ab279955a-config-data\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.380437 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.380457 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-dev\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.380479 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8c6v\" (UniqueName: \"kubernetes.io/projected/c7f00329-7067-49ec-84fa-cf2ab279955a-kube-api-access-w8c6v\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.380502 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-1\" (UID: 
\"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.380521 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-sys\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.380541 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-run\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.380559 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6fa5593-a616-432d-a4cc-1b787ffc516a-logs\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.380578 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7f00329-7067-49ec-84fa-cf2ab279955a-scripts\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.380596 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7f00329-7067-49ec-84fa-cf2ab279955a-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " 
pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.380632 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.380656 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.380678 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.380711 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-sys\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.380732 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " 
pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.380761 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.380784 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.380807 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.380833 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.380853 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-run\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 
crc kubenswrapper[4711]: I1203 12:48:31.380892 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.380952 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.380992 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6fa5593-a616-432d-a4cc-1b787ffc516a-scripts\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.382761 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-dev\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.382825 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c6fa5593-a616-432d-a4cc-1b787ffc516a-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.381063 4711 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-sys\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.381103 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.381128 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") device mount path \"/mnt/openstack/pv02\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.381005 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.383144 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-dev\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.381202 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.381434 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-run\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.381426 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.381464 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.381483 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.381498 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.381525 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-dev\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.381551 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") device mount path \"/mnt/openstack/pv09\"" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.381546 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7f00329-7067-49ec-84fa-cf2ab279955a-logs\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.381590 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.381615 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6fa5593-a616-432d-a4cc-1b787ffc516a-logs\") 
pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.381663 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") device mount path \"/mnt/openstack/pv15\"" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.381992 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7f00329-7067-49ec-84fa-cf2ab279955a-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.381179 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-run\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.381033 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.383431 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c6fa5593-a616-432d-a4cc-1b787ffc516a-httpd-run\") pod \"glance-default-internal-api-1\" (UID: 
\"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.381157 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-sys\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.387025 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7f00329-7067-49ec-84fa-cf2ab279955a-config-data\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.388532 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7f00329-7067-49ec-84fa-cf2ab279955a-scripts\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.395884 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6fa5593-a616-432d-a4cc-1b787ffc516a-scripts\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.397589 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6fa5593-a616-432d-a4cc-1b787ffc516a-config-data\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 
12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.397697 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8csqj\" (UniqueName: \"kubernetes.io/projected/c6fa5593-a616-432d-a4cc-1b787ffc516a-kube-api-access-8csqj\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.399224 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8c6v\" (UniqueName: \"kubernetes.io/projected/c7f00329-7067-49ec-84fa-cf2ab279955a-kube-api-access-w8c6v\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.409795 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.413211 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.415351 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-2\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.426663 4711 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-1\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.469672 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.482222 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.776700 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Dec 03 12:48:31 crc kubenswrapper[4711]: W1203 12:48:31.778630 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02e283b6_2f8f_47e8_b068_7fa6be429de7.slice/crio-91b0f8810b6605dc592fb5159a3b6813a5a8f4bccf48ab20be6e8f785c529937 WatchSource:0}: Error finding container 91b0f8810b6605dc592fb5159a3b6813a5a8f4bccf48ab20be6e8f785c529937: Status 404 returned error can't find the container with id 91b0f8810b6605dc592fb5159a3b6813a5a8f4bccf48ab20be6e8f785c529937 Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.849232 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 03 12:48:31 crc kubenswrapper[4711]: W1203 12:48:31.859148 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba2b9a61_4f13_4911_9977_157cbba6e185.slice/crio-528c60d5321aa219373035dcef0cbded70ad5c7c777ae04941bd584f0bfaea0d WatchSource:0}: Error finding container 528c60d5321aa219373035dcef0cbded70ad5c7c777ae04941bd584f0bfaea0d: 
Status 404 returned error can't find the container with id 528c60d5321aa219373035dcef0cbded70ad5c7c777ae04941bd584f0bfaea0d Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.937725 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 03 12:48:31 crc kubenswrapper[4711]: W1203 12:48:31.945083 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6fa5593_a616_432d_a4cc_1b787ffc516a.slice/crio-d2d5a9f3f3efafd60e612534e4a7bb8aed51b86b15cbae4491de9648241f307c WatchSource:0}: Error finding container d2d5a9f3f3efafd60e612534e4a7bb8aed51b86b15cbae4491de9648241f307c: Status 404 returned error can't find the container with id d2d5a9f3f3efafd60e612534e4a7bb8aed51b86b15cbae4491de9648241f307c Dec 03 12:48:31 crc kubenswrapper[4711]: I1203 12:48:31.948348 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Dec 03 12:48:32 crc kubenswrapper[4711]: I1203 12:48:32.550445 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"ba2b9a61-4f13-4911-9977-157cbba6e185","Type":"ContainerStarted","Data":"fffc4ebe132c85ec3aaa416c261b3f0292756c1a10503cb6a8612d2db95534fb"} Dec 03 12:48:32 crc kubenswrapper[4711]: I1203 12:48:32.551240 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"ba2b9a61-4f13-4911-9977-157cbba6e185","Type":"ContainerStarted","Data":"5b0da23dabd2f8ce69e2a5e24abd4534674e2a3929a0c954dba80073fe4c41f3"} Dec 03 12:48:32 crc kubenswrapper[4711]: I1203 12:48:32.551261 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"ba2b9a61-4f13-4911-9977-157cbba6e185","Type":"ContainerStarted","Data":"b1bb0289fe3cf9f334488d1b3bcf9bdf2eee306277df6dddc94b8e949cda0aa2"} Dec 03 
12:48:32 crc kubenswrapper[4711]: I1203 12:48:32.551274 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"ba2b9a61-4f13-4911-9977-157cbba6e185","Type":"ContainerStarted","Data":"528c60d5321aa219373035dcef0cbded70ad5c7c777ae04941bd584f0bfaea0d"} Dec 03 12:48:32 crc kubenswrapper[4711]: I1203 12:48:32.553873 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"02e283b6-2f8f-47e8-b068-7fa6be429de7","Type":"ContainerStarted","Data":"5779ae7d2bea7fbfe718d0c56ef8c738eca22f44efd27d43842d75352d2a624a"} Dec 03 12:48:32 crc kubenswrapper[4711]: I1203 12:48:32.553899 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"02e283b6-2f8f-47e8-b068-7fa6be429de7","Type":"ContainerStarted","Data":"e91ed50ef9999e6bd076b05c8b5d470fc5b3367f03774ca54f7dbc615cc73e1a"} Dec 03 12:48:32 crc kubenswrapper[4711]: I1203 12:48:32.553926 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"02e283b6-2f8f-47e8-b068-7fa6be429de7","Type":"ContainerStarted","Data":"49bfc8d3a2cef9743231e0a2505435d5711b57028c2cd642ec6dec59ec1e88e4"} Dec 03 12:48:32 crc kubenswrapper[4711]: I1203 12:48:32.553940 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"02e283b6-2f8f-47e8-b068-7fa6be429de7","Type":"ContainerStarted","Data":"91b0f8810b6605dc592fb5159a3b6813a5a8f4bccf48ab20be6e8f785c529937"} Dec 03 12:48:32 crc kubenswrapper[4711]: I1203 12:48:32.555704 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"c7f00329-7067-49ec-84fa-cf2ab279955a","Type":"ContainerStarted","Data":"ca4d2b57614a2085c63edb6cf6fa28d86e67cbe2daabaa6657b97dd01bd9bdb9"} Dec 03 12:48:32 crc kubenswrapper[4711]: I1203 
12:48:32.555736 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"c7f00329-7067-49ec-84fa-cf2ab279955a","Type":"ContainerStarted","Data":"275aaa69eb5b557ef1aa2b7e839335f4c14592fbc5d943dbd20fcb2710f0d58a"} Dec 03 12:48:32 crc kubenswrapper[4711]: I1203 12:48:32.555745 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"c7f00329-7067-49ec-84fa-cf2ab279955a","Type":"ContainerStarted","Data":"bfcd9e71bc8343f7c746b498f95afa65a37e4e6ffa8d2a840c7215cb895fe374"} Dec 03 12:48:32 crc kubenswrapper[4711]: I1203 12:48:32.557397 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"c6fa5593-a616-432d-a4cc-1b787ffc516a","Type":"ContainerStarted","Data":"c0c2ff7c72583059f305c9eaeb15115714391c40a04c80adf6e979abe2c0ddaf"} Dec 03 12:48:32 crc kubenswrapper[4711]: I1203 12:48:32.557417 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"c6fa5593-a616-432d-a4cc-1b787ffc516a","Type":"ContainerStarted","Data":"3ef876c6a9245b39dfdc3804ac53a519203ff9fa1a7edff3d0c7499efe252e21"} Dec 03 12:48:32 crc kubenswrapper[4711]: I1203 12:48:32.557430 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"c6fa5593-a616-432d-a4cc-1b787ffc516a","Type":"ContainerStarted","Data":"d2d5a9f3f3efafd60e612534e4a7bb8aed51b86b15cbae4491de9648241f307c"} Dec 03 12:48:32 crc kubenswrapper[4711]: I1203 12:48:32.581090 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-1" podStartSLOduration=3.5810726170000002 podStartE2EDuration="3.581072617s" podCreationTimestamp="2025-12-03 12:48:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-03 12:48:32.578295411 +0000 UTC m=+2031.247546676" watchObservedRunningTime="2025-12-03 12:48:32.581072617 +0000 UTC m=+2031.250323872" Dec 03 12:48:33 crc kubenswrapper[4711]: I1203 12:48:33.574601 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"c7f00329-7067-49ec-84fa-cf2ab279955a","Type":"ContainerStarted","Data":"a7514dd12055073cceda3671a612e1900a6e45254c7627587a01cd9d073e7c56"} Dec 03 12:48:33 crc kubenswrapper[4711]: I1203 12:48:33.580935 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"c6fa5593-a616-432d-a4cc-1b787ffc516a","Type":"ContainerStarted","Data":"73d1f60c74c8b4b58a872e2318360d07fbb26ff9b63b7c6753bc6c3d6d9516a7"} Dec 03 12:48:33 crc kubenswrapper[4711]: I1203 12:48:33.613103 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-2" podStartSLOduration=3.61306768 podStartE2EDuration="3.61306768s" podCreationTimestamp="2025-12-03 12:48:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:48:33.604356279 +0000 UTC m=+2032.273607554" watchObservedRunningTime="2025-12-03 12:48:33.61306768 +0000 UTC m=+2032.282318935" Dec 03 12:48:33 crc kubenswrapper[4711]: I1203 12:48:33.614172 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-2" podStartSLOduration=4.614165419 podStartE2EDuration="4.614165419s" podCreationTimestamp="2025-12-03 12:48:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:48:32.618540047 +0000 UTC m=+2031.287791312" watchObservedRunningTime="2025-12-03 12:48:33.614165419 +0000 UTC m=+2032.283416674" Dec 03 12:48:33 crc 
kubenswrapper[4711]: I1203 12:48:33.635197 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-1" podStartSLOduration=3.635176016 podStartE2EDuration="3.635176016s" podCreationTimestamp="2025-12-03 12:48:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:48:33.629546821 +0000 UTC m=+2032.298798116" watchObservedRunningTime="2025-12-03 12:48:33.635176016 +0000 UTC m=+2032.304427271" Dec 03 12:48:35 crc kubenswrapper[4711]: I1203 12:48:35.401511 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:48:35 crc kubenswrapper[4711]: I1203 12:48:35.401919 4711 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.343024 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.343558 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.343582 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.358150 4711 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.358192 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.358202 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.374204 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.386528 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.388877 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.393585 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.394130 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.434776 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.470240 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.470346 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.470369 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.482922 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.483077 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.483142 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.501717 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.510086 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.529035 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.531980 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.535424 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.540482 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:41 
crc kubenswrapper[4711]: I1203 12:48:41.655614 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.655672 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.656584 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.656620 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.656635 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.656648 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.656664 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.656678 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.656689 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.656702 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.656719 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.656731 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.672732 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.672879 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.673805 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.673969 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.674844 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.676928 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.676993 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.677452 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.679768 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:41 crc 
kubenswrapper[4711]: I1203 12:48:41.681442 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.683889 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:41 crc kubenswrapper[4711]: I1203 12:48:41.685991 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:42 crc kubenswrapper[4711]: I1203 12:48:42.713886 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Dec 03 12:48:42 crc kubenswrapper[4711]: I1203 12:48:42.736829 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 03 12:48:43 crc kubenswrapper[4711]: I1203 12:48:43.026645 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Dec 03 12:48:43 crc kubenswrapper[4711]: I1203 12:48:43.045347 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 03 12:48:44 crc kubenswrapper[4711]: I1203 12:48:44.689080 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="ba2b9a61-4f13-4911-9977-157cbba6e185" containerName="glance-log" containerID="cri-o://b1bb0289fe3cf9f334488d1b3bcf9bdf2eee306277df6dddc94b8e949cda0aa2" gracePeriod=30 Dec 03 12:48:44 crc kubenswrapper[4711]: I1203 12:48:44.689565 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="c6fa5593-a616-432d-a4cc-1b787ffc516a" containerName="glance-log" containerID="cri-o://3ef876c6a9245b39dfdc3804ac53a519203ff9fa1a7edff3d0c7499efe252e21" gracePeriod=30 Dec 03 
12:48:44 crc kubenswrapper[4711]: I1203 12:48:44.689226 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="ba2b9a61-4f13-4911-9977-157cbba6e185" containerName="glance-httpd" containerID="cri-o://5b0da23dabd2f8ce69e2a5e24abd4534674e2a3929a0c954dba80073fe4c41f3" gracePeriod=30 Dec 03 12:48:44 crc kubenswrapper[4711]: I1203 12:48:44.689187 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="ba2b9a61-4f13-4911-9977-157cbba6e185" containerName="glance-api" containerID="cri-o://fffc4ebe132c85ec3aaa416c261b3f0292756c1a10503cb6a8612d2db95534fb" gracePeriod=30 Dec 03 12:48:44 crc kubenswrapper[4711]: I1203 12:48:44.689759 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="c6fa5593-a616-432d-a4cc-1b787ffc516a" containerName="glance-api" containerID="cri-o://73d1f60c74c8b4b58a872e2318360d07fbb26ff9b63b7c6753bc6c3d6d9516a7" gracePeriod=30 Dec 03 12:48:44 crc kubenswrapper[4711]: I1203 12:48:44.689798 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="02e283b6-2f8f-47e8-b068-7fa6be429de7" containerName="glance-log" containerID="cri-o://49bfc8d3a2cef9743231e0a2505435d5711b57028c2cd642ec6dec59ec1e88e4" gracePeriod=30 Dec 03 12:48:44 crc kubenswrapper[4711]: I1203 12:48:44.689986 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="02e283b6-2f8f-47e8-b068-7fa6be429de7" containerName="glance-httpd" containerID="cri-o://e91ed50ef9999e6bd076b05c8b5d470fc5b3367f03774ca54f7dbc615cc73e1a" gracePeriod=30 Dec 03 12:48:44 crc kubenswrapper[4711]: I1203 12:48:44.689816 4711 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="c6fa5593-a616-432d-a4cc-1b787ffc516a" containerName="glance-httpd" containerID="cri-o://c0c2ff7c72583059f305c9eaeb15115714391c40a04c80adf6e979abe2c0ddaf" gracePeriod=30 Dec 03 12:48:44 crc kubenswrapper[4711]: I1203 12:48:44.690025 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="02e283b6-2f8f-47e8-b068-7fa6be429de7" containerName="glance-api" containerID="cri-o://5779ae7d2bea7fbfe718d0c56ef8c738eca22f44efd27d43842d75352d2a624a" gracePeriod=30 Dec 03 12:48:44 crc kubenswrapper[4711]: I1203 12:48:44.690070 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="c7f00329-7067-49ec-84fa-cf2ab279955a" containerName="glance-log" containerID="cri-o://275aaa69eb5b557ef1aa2b7e839335f4c14592fbc5d943dbd20fcb2710f0d58a" gracePeriod=30 Dec 03 12:48:44 crc kubenswrapper[4711]: I1203 12:48:44.690126 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="c7f00329-7067-49ec-84fa-cf2ab279955a" containerName="glance-api" containerID="cri-o://a7514dd12055073cceda3671a612e1900a6e45254c7627587a01cd9d073e7c56" gracePeriod=30 Dec 03 12:48:44 crc kubenswrapper[4711]: I1203 12:48:44.690181 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="c7f00329-7067-49ec-84fa-cf2ab279955a" containerName="glance-httpd" containerID="cri-o://ca4d2b57614a2085c63edb6cf6fa28d86e67cbe2daabaa6657b97dd01bd9bdb9" gracePeriod=30 Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.622475 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.698047 4711 generic.go:334] "Generic (PLEG): container finished" podID="c6fa5593-a616-432d-a4cc-1b787ffc516a" containerID="73d1f60c74c8b4b58a872e2318360d07fbb26ff9b63b7c6753bc6c3d6d9516a7" exitCode=0 Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.698083 4711 generic.go:334] "Generic (PLEG): container finished" podID="c6fa5593-a616-432d-a4cc-1b787ffc516a" containerID="c0c2ff7c72583059f305c9eaeb15115714391c40a04c80adf6e979abe2c0ddaf" exitCode=0 Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.698094 4711 generic.go:334] "Generic (PLEG): container finished" podID="c6fa5593-a616-432d-a4cc-1b787ffc516a" containerID="3ef876c6a9245b39dfdc3804ac53a519203ff9fa1a7edff3d0c7499efe252e21" exitCode=143 Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.698137 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"c6fa5593-a616-432d-a4cc-1b787ffc516a","Type":"ContainerDied","Data":"73d1f60c74c8b4b58a872e2318360d07fbb26ff9b63b7c6753bc6c3d6d9516a7"} Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.698170 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"c6fa5593-a616-432d-a4cc-1b787ffc516a","Type":"ContainerDied","Data":"c0c2ff7c72583059f305c9eaeb15115714391c40a04c80adf6e979abe2c0ddaf"} Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.698185 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"c6fa5593-a616-432d-a4cc-1b787ffc516a","Type":"ContainerDied","Data":"3ef876c6a9245b39dfdc3804ac53a519203ff9fa1a7edff3d0c7499efe252e21"} Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.698199 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" 
event={"ID":"c6fa5593-a616-432d-a4cc-1b787ffc516a","Type":"ContainerDied","Data":"d2d5a9f3f3efafd60e612534e4a7bb8aed51b86b15cbae4491de9648241f307c"} Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.698226 4711 scope.go:117] "RemoveContainer" containerID="73d1f60c74c8b4b58a872e2318360d07fbb26ff9b63b7c6753bc6c3d6d9516a7" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.698396 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.703847 4711 generic.go:334] "Generic (PLEG): container finished" podID="ba2b9a61-4f13-4911-9977-157cbba6e185" containerID="fffc4ebe132c85ec3aaa416c261b3f0292756c1a10503cb6a8612d2db95534fb" exitCode=0 Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.703931 4711 generic.go:334] "Generic (PLEG): container finished" podID="ba2b9a61-4f13-4911-9977-157cbba6e185" containerID="5b0da23dabd2f8ce69e2a5e24abd4534674e2a3929a0c954dba80073fe4c41f3" exitCode=0 Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.703947 4711 generic.go:334] "Generic (PLEG): container finished" podID="ba2b9a61-4f13-4911-9977-157cbba6e185" containerID="b1bb0289fe3cf9f334488d1b3bcf9bdf2eee306277df6dddc94b8e949cda0aa2" exitCode=143 Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.703947 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"ba2b9a61-4f13-4911-9977-157cbba6e185","Type":"ContainerDied","Data":"fffc4ebe132c85ec3aaa416c261b3f0292756c1a10503cb6a8612d2db95534fb"} Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.703986 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"ba2b9a61-4f13-4911-9977-157cbba6e185","Type":"ContainerDied","Data":"5b0da23dabd2f8ce69e2a5e24abd4534674e2a3929a0c954dba80073fe4c41f3"} Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 
12:48:45.703999 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"ba2b9a61-4f13-4911-9977-157cbba6e185","Type":"ContainerDied","Data":"b1bb0289fe3cf9f334488d1b3bcf9bdf2eee306277df6dddc94b8e949cda0aa2"} Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.715435 4711 generic.go:334] "Generic (PLEG): container finished" podID="02e283b6-2f8f-47e8-b068-7fa6be429de7" containerID="5779ae7d2bea7fbfe718d0c56ef8c738eca22f44efd27d43842d75352d2a624a" exitCode=0 Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.715487 4711 generic.go:334] "Generic (PLEG): container finished" podID="02e283b6-2f8f-47e8-b068-7fa6be429de7" containerID="e91ed50ef9999e6bd076b05c8b5d470fc5b3367f03774ca54f7dbc615cc73e1a" exitCode=0 Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.715502 4711 generic.go:334] "Generic (PLEG): container finished" podID="02e283b6-2f8f-47e8-b068-7fa6be429de7" containerID="49bfc8d3a2cef9743231e0a2505435d5711b57028c2cd642ec6dec59ec1e88e4" exitCode=143 Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.715515 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"02e283b6-2f8f-47e8-b068-7fa6be429de7","Type":"ContainerDied","Data":"5779ae7d2bea7fbfe718d0c56ef8c738eca22f44efd27d43842d75352d2a624a"} Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.715577 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"02e283b6-2f8f-47e8-b068-7fa6be429de7","Type":"ContainerDied","Data":"e91ed50ef9999e6bd076b05c8b5d470fc5b3367f03774ca54f7dbc615cc73e1a"} Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.715593 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" 
event={"ID":"02e283b6-2f8f-47e8-b068-7fa6be429de7","Type":"ContainerDied","Data":"49bfc8d3a2cef9743231e0a2505435d5711b57028c2cd642ec6dec59ec1e88e4"} Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.719428 4711 generic.go:334] "Generic (PLEG): container finished" podID="c7f00329-7067-49ec-84fa-cf2ab279955a" containerID="a7514dd12055073cceda3671a612e1900a6e45254c7627587a01cd9d073e7c56" exitCode=0 Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.719517 4711 generic.go:334] "Generic (PLEG): container finished" podID="c7f00329-7067-49ec-84fa-cf2ab279955a" containerID="ca4d2b57614a2085c63edb6cf6fa28d86e67cbe2daabaa6657b97dd01bd9bdb9" exitCode=0 Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.719539 4711 generic.go:334] "Generic (PLEG): container finished" podID="c7f00329-7067-49ec-84fa-cf2ab279955a" containerID="275aaa69eb5b557ef1aa2b7e839335f4c14592fbc5d943dbd20fcb2710f0d58a" exitCode=143 Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.719461 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"c7f00329-7067-49ec-84fa-cf2ab279955a","Type":"ContainerDied","Data":"a7514dd12055073cceda3671a612e1900a6e45254c7627587a01cd9d073e7c56"} Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.719644 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"c7f00329-7067-49ec-84fa-cf2ab279955a","Type":"ContainerDied","Data":"ca4d2b57614a2085c63edb6cf6fa28d86e67cbe2daabaa6657b97dd01bd9bdb9"} Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.719719 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"c7f00329-7067-49ec-84fa-cf2ab279955a","Type":"ContainerDied","Data":"275aaa69eb5b557ef1aa2b7e839335f4c14592fbc5d943dbd20fcb2710f0d58a"} Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.737066 4711 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.744964 4711 scope.go:117] "RemoveContainer" containerID="c0c2ff7c72583059f305c9eaeb15115714391c40a04c80adf6e979abe2c0ddaf" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.788129 4711 scope.go:117] "RemoveContainer" containerID="3ef876c6a9245b39dfdc3804ac53a519203ff9fa1a7edff3d0c7499efe252e21" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.789123 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6fa5593-a616-432d-a4cc-1b787ffc516a-logs\") pod \"c6fa5593-a616-432d-a4cc-1b787ffc516a\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.789206 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8csqj\" (UniqueName: \"kubernetes.io/projected/c6fa5593-a616-432d-a4cc-1b787ffc516a-kube-api-access-8csqj\") pod \"c6fa5593-a616-432d-a4cc-1b787ffc516a\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.789236 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6fa5593-a616-432d-a4cc-1b787ffc516a-scripts\") pod \"c6fa5593-a616-432d-a4cc-1b787ffc516a\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.789282 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-etc-iscsi\") pod \"c6fa5593-a616-432d-a4cc-1b787ffc516a\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.789311 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"c6fa5593-a616-432d-a4cc-1b787ffc516a\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.789337 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-sys\") pod \"c6fa5593-a616-432d-a4cc-1b787ffc516a\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.789389 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"c6fa5593-a616-432d-a4cc-1b787ffc516a\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.789454 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-lib-modules\") pod \"c6fa5593-a616-432d-a4cc-1b787ffc516a\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.789501 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-dev\") pod \"c6fa5593-a616-432d-a4cc-1b787ffc516a\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.789520 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-etc-nvme\") pod \"c6fa5593-a616-432d-a4cc-1b787ffc516a\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.789555 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c6fa5593-a616-432d-a4cc-1b787ffc516a-config-data\") pod \"c6fa5593-a616-432d-a4cc-1b787ffc516a\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.789588 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-run\") pod \"c6fa5593-a616-432d-a4cc-1b787ffc516a\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.789614 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c6fa5593-a616-432d-a4cc-1b787ffc516a-httpd-run\") pod \"c6fa5593-a616-432d-a4cc-1b787ffc516a\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.789619 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6fa5593-a616-432d-a4cc-1b787ffc516a-logs" (OuterVolumeSpecName: "logs") pod "c6fa5593-a616-432d-a4cc-1b787ffc516a" (UID: "c6fa5593-a616-432d-a4cc-1b787ffc516a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.789689 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-var-locks-brick\") pod \"c6fa5593-a616-432d-a4cc-1b787ffc516a\" (UID: \"c6fa5593-a616-432d-a4cc-1b787ffc516a\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.789810 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "c6fa5593-a616-432d-a4cc-1b787ffc516a" (UID: "c6fa5593-a616-432d-a4cc-1b787ffc516a"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.789848 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "c6fa5593-a616-432d-a4cc-1b787ffc516a" (UID: "c6fa5593-a616-432d-a4cc-1b787ffc516a"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.789957 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "c6fa5593-a616-432d-a4cc-1b787ffc516a" (UID: "c6fa5593-a616-432d-a4cc-1b787ffc516a"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.790016 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-sys" (OuterVolumeSpecName: "sys") pod "c6fa5593-a616-432d-a4cc-1b787ffc516a" (UID: "c6fa5593-a616-432d-a4cc-1b787ffc516a"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.790061 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "c6fa5593-a616-432d-a4cc-1b787ffc516a" (UID: "c6fa5593-a616-432d-a4cc-1b787ffc516a"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.790095 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-run" (OuterVolumeSpecName: "run") pod "c6fa5593-a616-432d-a4cc-1b787ffc516a" (UID: "c6fa5593-a616-432d-a4cc-1b787ffc516a"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.790237 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-dev" (OuterVolumeSpecName: "dev") pod "c6fa5593-a616-432d-a4cc-1b787ffc516a" (UID: "c6fa5593-a616-432d-a4cc-1b787ffc516a"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.790564 4711 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6fa5593-a616-432d-a4cc-1b787ffc516a-logs\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.791753 4711 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.791842 4711 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-sys\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.791895 4711 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.792032 4711 reconciler_common.go:293] "Volume detached 
for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-dev\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.792108 4711 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.792167 4711 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.792218 4711 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c6fa5593-a616-432d-a4cc-1b787ffc516a-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.793299 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6fa5593-a616-432d-a4cc-1b787ffc516a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c6fa5593-a616-432d-a4cc-1b787ffc516a" (UID: "c6fa5593-a616-432d-a4cc-1b787ffc516a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.794454 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "c6fa5593-a616-432d-a4cc-1b787ffc516a" (UID: "c6fa5593-a616-432d-a4cc-1b787ffc516a"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.795058 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6fa5593-a616-432d-a4cc-1b787ffc516a-scripts" (OuterVolumeSpecName: "scripts") pod "c6fa5593-a616-432d-a4cc-1b787ffc516a" (UID: "c6fa5593-a616-432d-a4cc-1b787ffc516a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.795075 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance-cache") pod "c6fa5593-a616-432d-a4cc-1b787ffc516a" (UID: "c6fa5593-a616-432d-a4cc-1b787ffc516a"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.796392 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.802648 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6fa5593-a616-432d-a4cc-1b787ffc516a-kube-api-access-8csqj" (OuterVolumeSpecName: "kube-api-access-8csqj") pod "c6fa5593-a616-432d-a4cc-1b787ffc516a" (UID: "c6fa5593-a616-432d-a4cc-1b787ffc516a"). InnerVolumeSpecName "kube-api-access-8csqj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.808170 4711 scope.go:117] "RemoveContainer" containerID="73d1f60c74c8b4b58a872e2318360d07fbb26ff9b63b7c6753bc6c3d6d9516a7" Dec 03 12:48:45 crc kubenswrapper[4711]: E1203 12:48:45.808759 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73d1f60c74c8b4b58a872e2318360d07fbb26ff9b63b7c6753bc6c3d6d9516a7\": container with ID starting with 73d1f60c74c8b4b58a872e2318360d07fbb26ff9b63b7c6753bc6c3d6d9516a7 not found: ID does not exist" containerID="73d1f60c74c8b4b58a872e2318360d07fbb26ff9b63b7c6753bc6c3d6d9516a7" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.808864 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73d1f60c74c8b4b58a872e2318360d07fbb26ff9b63b7c6753bc6c3d6d9516a7"} err="failed to get container status \"73d1f60c74c8b4b58a872e2318360d07fbb26ff9b63b7c6753bc6c3d6d9516a7\": rpc error: code = NotFound desc = could not find container \"73d1f60c74c8b4b58a872e2318360d07fbb26ff9b63b7c6753bc6c3d6d9516a7\": container with ID starting with 73d1f60c74c8b4b58a872e2318360d07fbb26ff9b63b7c6753bc6c3d6d9516a7 not found: ID does not exist" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.808995 4711 scope.go:117] "RemoveContainer" containerID="c0c2ff7c72583059f305c9eaeb15115714391c40a04c80adf6e979abe2c0ddaf" Dec 03 12:48:45 crc kubenswrapper[4711]: E1203 12:48:45.809534 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0c2ff7c72583059f305c9eaeb15115714391c40a04c80adf6e979abe2c0ddaf\": container with ID starting with c0c2ff7c72583059f305c9eaeb15115714391c40a04c80adf6e979abe2c0ddaf not found: ID does not exist" containerID="c0c2ff7c72583059f305c9eaeb15115714391c40a04c80adf6e979abe2c0ddaf" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.809623 
4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0c2ff7c72583059f305c9eaeb15115714391c40a04c80adf6e979abe2c0ddaf"} err="failed to get container status \"c0c2ff7c72583059f305c9eaeb15115714391c40a04c80adf6e979abe2c0ddaf\": rpc error: code = NotFound desc = could not find container \"c0c2ff7c72583059f305c9eaeb15115714391c40a04c80adf6e979abe2c0ddaf\": container with ID starting with c0c2ff7c72583059f305c9eaeb15115714391c40a04c80adf6e979abe2c0ddaf not found: ID does not exist" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.809707 4711 scope.go:117] "RemoveContainer" containerID="3ef876c6a9245b39dfdc3804ac53a519203ff9fa1a7edff3d0c7499efe252e21" Dec 03 12:48:45 crc kubenswrapper[4711]: E1203 12:48:45.810081 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ef876c6a9245b39dfdc3804ac53a519203ff9fa1a7edff3d0c7499efe252e21\": container with ID starting with 3ef876c6a9245b39dfdc3804ac53a519203ff9fa1a7edff3d0c7499efe252e21 not found: ID does not exist" containerID="3ef876c6a9245b39dfdc3804ac53a519203ff9fa1a7edff3d0c7499efe252e21" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.810213 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ef876c6a9245b39dfdc3804ac53a519203ff9fa1a7edff3d0c7499efe252e21"} err="failed to get container status \"3ef876c6a9245b39dfdc3804ac53a519203ff9fa1a7edff3d0c7499efe252e21\": rpc error: code = NotFound desc = could not find container \"3ef876c6a9245b39dfdc3804ac53a519203ff9fa1a7edff3d0c7499efe252e21\": container with ID starting with 3ef876c6a9245b39dfdc3804ac53a519203ff9fa1a7edff3d0c7499efe252e21 not found: ID does not exist" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.810316 4711 scope.go:117] "RemoveContainer" containerID="73d1f60c74c8b4b58a872e2318360d07fbb26ff9b63b7c6753bc6c3d6d9516a7" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 
12:48:45.810446 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.810714 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73d1f60c74c8b4b58a872e2318360d07fbb26ff9b63b7c6753bc6c3d6d9516a7"} err="failed to get container status \"73d1f60c74c8b4b58a872e2318360d07fbb26ff9b63b7c6753bc6c3d6d9516a7\": rpc error: code = NotFound desc = could not find container \"73d1f60c74c8b4b58a872e2318360d07fbb26ff9b63b7c6753bc6c3d6d9516a7\": container with ID starting with 73d1f60c74c8b4b58a872e2318360d07fbb26ff9b63b7c6753bc6c3d6d9516a7 not found: ID does not exist" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.810805 4711 scope.go:117] "RemoveContainer" containerID="c0c2ff7c72583059f305c9eaeb15115714391c40a04c80adf6e979abe2c0ddaf" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.811109 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0c2ff7c72583059f305c9eaeb15115714391c40a04c80adf6e979abe2c0ddaf"} err="failed to get container status \"c0c2ff7c72583059f305c9eaeb15115714391c40a04c80adf6e979abe2c0ddaf\": rpc error: code = NotFound desc = could not find container \"c0c2ff7c72583059f305c9eaeb15115714391c40a04c80adf6e979abe2c0ddaf\": container with ID starting with c0c2ff7c72583059f305c9eaeb15115714391c40a04c80adf6e979abe2c0ddaf not found: ID does not exist" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.811136 4711 scope.go:117] "RemoveContainer" containerID="3ef876c6a9245b39dfdc3804ac53a519203ff9fa1a7edff3d0c7499efe252e21" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.811314 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ef876c6a9245b39dfdc3804ac53a519203ff9fa1a7edff3d0c7499efe252e21"} err="failed to get container status 
\"3ef876c6a9245b39dfdc3804ac53a519203ff9fa1a7edff3d0c7499efe252e21\": rpc error: code = NotFound desc = could not find container \"3ef876c6a9245b39dfdc3804ac53a519203ff9fa1a7edff3d0c7499efe252e21\": container with ID starting with 3ef876c6a9245b39dfdc3804ac53a519203ff9fa1a7edff3d0c7499efe252e21 not found: ID does not exist" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.811397 4711 scope.go:117] "RemoveContainer" containerID="73d1f60c74c8b4b58a872e2318360d07fbb26ff9b63b7c6753bc6c3d6d9516a7" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.811636 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73d1f60c74c8b4b58a872e2318360d07fbb26ff9b63b7c6753bc6c3d6d9516a7"} err="failed to get container status \"73d1f60c74c8b4b58a872e2318360d07fbb26ff9b63b7c6753bc6c3d6d9516a7\": rpc error: code = NotFound desc = could not find container \"73d1f60c74c8b4b58a872e2318360d07fbb26ff9b63b7c6753bc6c3d6d9516a7\": container with ID starting with 73d1f60c74c8b4b58a872e2318360d07fbb26ff9b63b7c6753bc6c3d6d9516a7 not found: ID does not exist" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.811702 4711 scope.go:117] "RemoveContainer" containerID="c0c2ff7c72583059f305c9eaeb15115714391c40a04c80adf6e979abe2c0ddaf" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.811897 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0c2ff7c72583059f305c9eaeb15115714391c40a04c80adf6e979abe2c0ddaf"} err="failed to get container status \"c0c2ff7c72583059f305c9eaeb15115714391c40a04c80adf6e979abe2c0ddaf\": rpc error: code = NotFound desc = could not find container \"c0c2ff7c72583059f305c9eaeb15115714391c40a04c80adf6e979abe2c0ddaf\": container with ID starting with c0c2ff7c72583059f305c9eaeb15115714391c40a04c80adf6e979abe2c0ddaf not found: ID does not exist" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.812169 4711 scope.go:117] "RemoveContainer" 
containerID="3ef876c6a9245b39dfdc3804ac53a519203ff9fa1a7edff3d0c7499efe252e21" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.812372 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ef876c6a9245b39dfdc3804ac53a519203ff9fa1a7edff3d0c7499efe252e21"} err="failed to get container status \"3ef876c6a9245b39dfdc3804ac53a519203ff9fa1a7edff3d0c7499efe252e21\": rpc error: code = NotFound desc = could not find container \"3ef876c6a9245b39dfdc3804ac53a519203ff9fa1a7edff3d0c7499efe252e21\": container with ID starting with 3ef876c6a9245b39dfdc3804ac53a519203ff9fa1a7edff3d0c7499efe252e21 not found: ID does not exist" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.888204 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6fa5593-a616-432d-a4cc-1b787ffc516a-config-data" (OuterVolumeSpecName: "config-data") pod "c6fa5593-a616-432d-a4cc-1b787ffc516a" (UID: "c6fa5593-a616-432d-a4cc-1b787ffc516a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.892536 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02e283b6-2f8f-47e8-b068-7fa6be429de7-config-data\") pod \"02e283b6-2f8f-47e8-b068-7fa6be429de7\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.892686 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-lib-modules\") pod \"c7f00329-7067-49ec-84fa-cf2ab279955a\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.892808 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "c7f00329-7067-49ec-84fa-cf2ab279955a" (UID: "c7f00329-7067-49ec-84fa-cf2ab279955a"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.892900 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"02e283b6-2f8f-47e8-b068-7fa6be429de7\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.893060 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-etc-iscsi\") pod \"02e283b6-2f8f-47e8-b068-7fa6be429de7\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.893157 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-run\") pod \"c7f00329-7067-49ec-84fa-cf2ab279955a\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.893264 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02e283b6-2f8f-47e8-b068-7fa6be429de7-logs\") pod \"02e283b6-2f8f-47e8-b068-7fa6be429de7\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.893365 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-sys\") pod \"c7f00329-7067-49ec-84fa-cf2ab279955a\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.893156 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod 
"02e283b6-2f8f-47e8-b068-7fa6be429de7" (UID: "02e283b6-2f8f-47e8-b068-7fa6be429de7"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.893508 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-etc-nvme\") pod \"c7f00329-7067-49ec-84fa-cf2ab279955a\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.893615 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"c7f00329-7067-49ec-84fa-cf2ab279955a\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.893730 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-lib-modules\") pod \"02e283b6-2f8f-47e8-b068-7fa6be429de7\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.893891 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-etc-nvme\") pod \"02e283b6-2f8f-47e8-b068-7fa6be429de7\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.893993 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-etc-iscsi\") pod \"c7f00329-7067-49ec-84fa-cf2ab279955a\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.894074 4711 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-etc-iscsi\") pod \"ba2b9a61-4f13-4911-9977-157cbba6e185\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.894161 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba2b9a61-4f13-4911-9977-157cbba6e185-scripts\") pod \"ba2b9a61-4f13-4911-9977-157cbba6e185\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.894250 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-sys\") pod \"02e283b6-2f8f-47e8-b068-7fa6be429de7\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.896941 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02e283b6-2f8f-47e8-b068-7fa6be429de7-httpd-run\") pod \"02e283b6-2f8f-47e8-b068-7fa6be429de7\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.897122 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-var-locks-brick\") pod \"c7f00329-7067-49ec-84fa-cf2ab279955a\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.897212 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-dev\") pod \"02e283b6-2f8f-47e8-b068-7fa6be429de7\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 
12:48:45.897319 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-lib-modules\") pod \"ba2b9a61-4f13-4911-9977-157cbba6e185\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.897415 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ba2b9a61-4f13-4911-9977-157cbba6e185\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.897509 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba2b9a61-4f13-4911-9977-157cbba6e185-config-data\") pod \"ba2b9a61-4f13-4911-9977-157cbba6e185\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.893206 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-run" (OuterVolumeSpecName: "run") pod "c7f00329-7067-49ec-84fa-cf2ab279955a" (UID: "c7f00329-7067-49ec-84fa-cf2ab279955a"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.893543 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-sys" (OuterVolumeSpecName: "sys") pod "c7f00329-7067-49ec-84fa-cf2ab279955a" (UID: "c7f00329-7067-49ec-84fa-cf2ab279955a"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.893570 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "c7f00329-7067-49ec-84fa-cf2ab279955a" (UID: "c7f00329-7067-49ec-84fa-cf2ab279955a"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.893591 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02e283b6-2f8f-47e8-b068-7fa6be429de7-logs" (OuterVolumeSpecName: "logs") pod "02e283b6-2f8f-47e8-b068-7fa6be429de7" (UID: "02e283b6-2f8f-47e8-b068-7fa6be429de7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.893856 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "02e283b6-2f8f-47e8-b068-7fa6be429de7" (UID: "02e283b6-2f8f-47e8-b068-7fa6be429de7"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.894341 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "c7f00329-7067-49ec-84fa-cf2ab279955a" (UID: "c7f00329-7067-49ec-84fa-cf2ab279955a"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.894360 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "02e283b6-2f8f-47e8-b068-7fa6be429de7" (UID: "02e283b6-2f8f-47e8-b068-7fa6be429de7"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.894375 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "ba2b9a61-4f13-4911-9977-157cbba6e185" (UID: "ba2b9a61-4f13-4911-9977-157cbba6e185"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.894395 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-sys" (OuterVolumeSpecName: "sys") pod "02e283b6-2f8f-47e8-b068-7fa6be429de7" (UID: "02e283b6-2f8f-47e8-b068-7fa6be429de7"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.895736 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "glance") pod "02e283b6-2f8f-47e8-b068-7fa6be429de7" (UID: "02e283b6-2f8f-47e8-b068-7fa6be429de7"). InnerVolumeSpecName "local-storage18-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.897632 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "c7f00329-7067-49ec-84fa-cf2ab279955a" (UID: "c7f00329-7067-49ec-84fa-cf2ab279955a"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.897734 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-dev" (OuterVolumeSpecName: "dev") pod "02e283b6-2f8f-47e8-b068-7fa6be429de7" (UID: "02e283b6-2f8f-47e8-b068-7fa6be429de7"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.897831 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02e283b6-2f8f-47e8-b068-7fa6be429de7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "02e283b6-2f8f-47e8-b068-7fa6be429de7" (UID: "02e283b6-2f8f-47e8-b068-7fa6be429de7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.898116 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance-cache") pod "c7f00329-7067-49ec-84fa-cf2ab279955a" (UID: "c7f00329-7067-49ec-84fa-cf2ab279955a"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.898130 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "ba2b9a61-4f13-4911-9977-157cbba6e185" (UID: "ba2b9a61-4f13-4911-9977-157cbba6e185"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.898230 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba2b9a61-4f13-4911-9977-157cbba6e185-scripts" (OuterVolumeSpecName: "scripts") pod "ba2b9a61-4f13-4911-9977-157cbba6e185" (UID: "ba2b9a61-4f13-4911-9977-157cbba6e185"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.898377 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-var-locks-brick\") pod \"ba2b9a61-4f13-4911-9977-157cbba6e185\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.898539 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"02e283b6-2f8f-47e8-b068-7fa6be429de7\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.898675 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-run\") pod \"02e283b6-2f8f-47e8-b068-7fa6be429de7\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.898955 4711 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-var-locks-brick\") pod \"02e283b6-2f8f-47e8-b068-7fa6be429de7\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.899326 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8c6v\" (UniqueName: \"kubernetes.io/projected/c7f00329-7067-49ec-84fa-cf2ab279955a-kube-api-access-w8c6v\") pod \"c7f00329-7067-49ec-84fa-cf2ab279955a\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.899459 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02e283b6-2f8f-47e8-b068-7fa6be429de7-scripts\") pod \"02e283b6-2f8f-47e8-b068-7fa6be429de7\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.898491 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "ba2b9a61-4f13-4911-9977-157cbba6e185" (UID: "ba2b9a61-4f13-4911-9977-157cbba6e185"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.899034 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-run" (OuterVolumeSpecName: "run") pod "02e283b6-2f8f-47e8-b068-7fa6be429de7" (UID: "02e283b6-2f8f-47e8-b068-7fa6be429de7"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.899068 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "02e283b6-2f8f-47e8-b068-7fa6be429de7" (UID: "02e283b6-2f8f-47e8-b068-7fa6be429de7"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.899995 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance-cache") pod "ba2b9a61-4f13-4911-9977-157cbba6e185" (UID: "ba2b9a61-4f13-4911-9977-157cbba6e185"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.900106 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-dev" (OuterVolumeSpecName: "dev") pod "ba2b9a61-4f13-4911-9977-157cbba6e185" (UID: "ba2b9a61-4f13-4911-9977-157cbba6e185"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.899943 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-dev\") pod \"ba2b9a61-4f13-4911-9977-157cbba6e185\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.900342 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8fmx\" (UniqueName: \"kubernetes.io/projected/02e283b6-2f8f-47e8-b068-7fa6be429de7-kube-api-access-l8fmx\") pod \"02e283b6-2f8f-47e8-b068-7fa6be429de7\" (UID: \"02e283b6-2f8f-47e8-b068-7fa6be429de7\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.900445 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba2b9a61-4f13-4911-9977-157cbba6e185-logs\") pod \"ba2b9a61-4f13-4911-9977-157cbba6e185\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.900752 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7f00329-7067-49ec-84fa-cf2ab279955a-config-data\") pod \"c7f00329-7067-49ec-84fa-cf2ab279955a\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.900871 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-sys\") pod \"ba2b9a61-4f13-4911-9977-157cbba6e185\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.901351 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/ba2b9a61-4f13-4911-9977-157cbba6e185-httpd-run\") pod \"ba2b9a61-4f13-4911-9977-157cbba6e185\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.901587 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"c7f00329-7067-49ec-84fa-cf2ab279955a\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.901692 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-dev\") pod \"c7f00329-7067-49ec-84fa-cf2ab279955a\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.901794 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7f00329-7067-49ec-84fa-cf2ab279955a-scripts\") pod \"c7f00329-7067-49ec-84fa-cf2ab279955a\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.901876 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-etc-nvme\") pod \"ba2b9a61-4f13-4911-9977-157cbba6e185\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.901997 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7f00329-7067-49ec-84fa-cf2ab279955a-logs\") pod \"c7f00329-7067-49ec-84fa-cf2ab279955a\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.902681 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-run\") pod \"ba2b9a61-4f13-4911-9977-157cbba6e185\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.902794 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ba2b9a61-4f13-4911-9977-157cbba6e185\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.903188 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kzvd\" (UniqueName: \"kubernetes.io/projected/ba2b9a61-4f13-4911-9977-157cbba6e185-kube-api-access-7kzvd\") pod \"ba2b9a61-4f13-4911-9977-157cbba6e185\" (UID: \"ba2b9a61-4f13-4911-9977-157cbba6e185\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.903304 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7f00329-7067-49ec-84fa-cf2ab279955a-httpd-run\") pod \"c7f00329-7067-49ec-84fa-cf2ab279955a\" (UID: \"c7f00329-7067-49ec-84fa-cf2ab279955a\") " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.915301 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6fa5593-a616-432d-a4cc-1b787ffc516a-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.915339 4711 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c6fa5593-a616-432d-a4cc-1b787ffc516a-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.915365 4711 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:45 crc 
kubenswrapper[4711]: I1203 12:48:45.915402 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.915447 4711 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.915469 4711 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.915538 4711 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02e283b6-2f8f-47e8-b068-7fa6be429de7-logs\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.915549 4711 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-sys\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.915564 4711 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.915590 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.915603 4711 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-lib-modules\") on node \"crc\" DevicePath 
\"\"" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.915616 4711 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.915632 4711 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.915648 4711 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba2b9a61-4f13-4911-9977-157cbba6e185-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.915660 4711 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.915675 4711 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-sys\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.915688 4711 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02e283b6-2f8f-47e8-b068-7fa6be429de7-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.915701 4711 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.915713 4711 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-dev\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.915728 4711 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.915749 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.915763 4711 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.915781 4711 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.915794 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8csqj\" (UniqueName: \"kubernetes.io/projected/c6fa5593-a616-432d-a4cc-1b787ffc516a-kube-api-access-8csqj\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.915807 4711 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6fa5593-a616-432d-a4cc-1b787ffc516a-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.915819 4711 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/02e283b6-2f8f-47e8-b068-7fa6be429de7-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.915842 
4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.915854 4711 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-dev\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.915877 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.900988 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba2b9a61-4f13-4911-9977-157cbba6e185-logs" (OuterVolumeSpecName: "logs") pod "ba2b9a61-4f13-4911-9977-157cbba6e185" (UID: "ba2b9a61-4f13-4911-9977-157cbba6e185"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.900991 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-sys" (OuterVolumeSpecName: "sys") pod "ba2b9a61-4f13-4911-9977-157cbba6e185" (UID: "ba2b9a61-4f13-4911-9977-157cbba6e185"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.902264 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02e283b6-2f8f-47e8-b068-7fa6be429de7-scripts" (OuterVolumeSpecName: "scripts") pod "02e283b6-2f8f-47e8-b068-7fa6be429de7" (UID: "02e283b6-2f8f-47e8-b068-7fa6be429de7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.902299 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-dev" (OuterVolumeSpecName: "dev") pod "c7f00329-7067-49ec-84fa-cf2ab279955a" (UID: "c7f00329-7067-49ec-84fa-cf2ab279955a"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.902340 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "ba2b9a61-4f13-4911-9977-157cbba6e185" (UID: "ba2b9a61-4f13-4911-9977-157cbba6e185"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.917923 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-run" (OuterVolumeSpecName: "run") pod "ba2b9a61-4f13-4911-9977-157cbba6e185" (UID: "ba2b9a61-4f13-4911-9977-157cbba6e185"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.902638 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7f00329-7067-49ec-84fa-cf2ab279955a-logs" (OuterVolumeSpecName: "logs") pod "c7f00329-7067-49ec-84fa-cf2ab279955a" (UID: "c7f00329-7067-49ec-84fa-cf2ab279955a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.913308 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba2b9a61-4f13-4911-9977-157cbba6e185-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ba2b9a61-4f13-4911-9977-157cbba6e185" (UID: "ba2b9a61-4f13-4911-9977-157cbba6e185"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.913355 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage16-crc" (OuterVolumeSpecName: "glance-cache") pod "02e283b6-2f8f-47e8-b068-7fa6be429de7" (UID: "02e283b6-2f8f-47e8-b068-7fa6be429de7"). InnerVolumeSpecName "local-storage16-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.913443 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7f00329-7067-49ec-84fa-cf2ab279955a-kube-api-access-w8c6v" (OuterVolumeSpecName: "kube-api-access-w8c6v") pod "c7f00329-7067-49ec-84fa-cf2ab279955a" (UID: "c7f00329-7067-49ec-84fa-cf2ab279955a"). InnerVolumeSpecName "kube-api-access-w8c6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.917313 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02e283b6-2f8f-47e8-b068-7fa6be429de7-kube-api-access-l8fmx" (OuterVolumeSpecName: "kube-api-access-l8fmx") pod "02e283b6-2f8f-47e8-b068-7fa6be429de7" (UID: "02e283b6-2f8f-47e8-b068-7fa6be429de7"). InnerVolumeSpecName "kube-api-access-l8fmx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.917781 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7f00329-7067-49ec-84fa-cf2ab279955a-scripts" (OuterVolumeSpecName: "scripts") pod "c7f00329-7067-49ec-84fa-cf2ab279955a" (UID: "c7f00329-7067-49ec-84fa-cf2ab279955a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.918330 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7f00329-7067-49ec-84fa-cf2ab279955a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c7f00329-7067-49ec-84fa-cf2ab279955a" (UID: "c7f00329-7067-49ec-84fa-cf2ab279955a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.924953 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "ba2b9a61-4f13-4911-9977-157cbba6e185" (UID: "ba2b9a61-4f13-4911-9977-157cbba6e185"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.927623 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage15-crc" (OuterVolumeSpecName: "glance") pod "c7f00329-7067-49ec-84fa-cf2ab279955a" (UID: "c7f00329-7067-49ec-84fa-cf2ab279955a"). InnerVolumeSpecName "local-storage15-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.945509 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba2b9a61-4f13-4911-9977-157cbba6e185-kube-api-access-7kzvd" (OuterVolumeSpecName: "kube-api-access-7kzvd") pod "ba2b9a61-4f13-4911-9977-157cbba6e185" (UID: "ba2b9a61-4f13-4911-9977-157cbba6e185"). InnerVolumeSpecName "kube-api-access-7kzvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.947304 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.948410 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.948778 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.954627 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.957473 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.981245 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02e283b6-2f8f-47e8-b068-7fa6be429de7-config-data" (OuterVolumeSpecName: "config-data") pod "02e283b6-2f8f-47e8-b068-7fa6be429de7" (UID: 
"02e283b6-2f8f-47e8-b068-7fa6be429de7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.986339 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba2b9a61-4f13-4911-9977-157cbba6e185-config-data" (OuterVolumeSpecName: "config-data") pod "ba2b9a61-4f13-4911-9977-157cbba6e185" (UID: "ba2b9a61-4f13-4911-9977-157cbba6e185"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:48:45 crc kubenswrapper[4711]: I1203 12:48:45.996279 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7f00329-7067-49ec-84fa-cf2ab279955a-config-data" (OuterVolumeSpecName: "config-data") pod "c7f00329-7067-49ec-84fa-cf2ab279955a" (UID: "c7f00329-7067-49ec-84fa-cf2ab279955a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.016821 4711 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.016878 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.016895 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kzvd\" (UniqueName: \"kubernetes.io/projected/ba2b9a61-4f13-4911-9977-157cbba6e185-kube-api-access-7kzvd\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.016923 4711 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7f00329-7067-49ec-84fa-cf2ab279955a-httpd-run\") on node 
\"crc\" DevicePath \"\"" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.016937 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02e283b6-2f8f-47e8-b068-7fa6be429de7-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.016948 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.016959 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.016969 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.016980 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba2b9a61-4f13-4911-9977-157cbba6e185-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.016998 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" " Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.017010 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.017023 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8c6v\" (UniqueName: 
\"kubernetes.io/projected/c7f00329-7067-49ec-84fa-cf2ab279955a-kube-api-access-w8c6v\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.017033 4711 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02e283b6-2f8f-47e8-b068-7fa6be429de7-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.017044 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8fmx\" (UniqueName: \"kubernetes.io/projected/02e283b6-2f8f-47e8-b068-7fa6be429de7-kube-api-access-l8fmx\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.017054 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.017063 4711 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba2b9a61-4f13-4911-9977-157cbba6e185-logs\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.017074 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7f00329-7067-49ec-84fa-cf2ab279955a-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.017084 4711 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-sys\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.017094 4711 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba2b9a61-4f13-4911-9977-157cbba6e185-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.017111 4711 
reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" " Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.017122 4711 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c7f00329-7067-49ec-84fa-cf2ab279955a-dev\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.017132 4711 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7f00329-7067-49ec-84fa-cf2ab279955a-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.017143 4711 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ba2b9a61-4f13-4911-9977-157cbba6e185-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.017154 4711 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7f00329-7067-49ec-84fa-cf2ab279955a-logs\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.034440 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.037283 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.038520 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage15-crc" (UniqueName: "kubernetes.io/local-volume/local-storage15-crc") on node "crc" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.039440 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage16-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage16-crc") on node "crc" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.045864 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.118231 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.118257 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.118267 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.731442 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"ba2b9a61-4f13-4911-9977-157cbba6e185","Type":"ContainerDied","Data":"528c60d5321aa219373035dcef0cbded70ad5c7c777ae04941bd584f0bfaea0d"} Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.731494 4711 scope.go:117] "RemoveContainer" containerID="fffc4ebe132c85ec3aaa416c261b3f0292756c1a10503cb6a8612d2db95534fb" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.731623 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.747112 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"02e283b6-2f8f-47e8-b068-7fa6be429de7","Type":"ContainerDied","Data":"91b0f8810b6605dc592fb5159a3b6813a5a8f4bccf48ab20be6e8f785c529937"} Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.747246 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.796202 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"c7f00329-7067-49ec-84fa-cf2ab279955a","Type":"ContainerDied","Data":"bfcd9e71bc8343f7c746b498f95afa65a37e4e6ffa8d2a840c7215cb895fe374"} Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.796356 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.812435 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.826974 4711 scope.go:117] "RemoveContainer" containerID="5b0da23dabd2f8ce69e2a5e24abd4534674e2a3929a0c954dba80073fe4c41f3" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.832116 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.848461 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.861892 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.883965 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.890190 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.904820 4711 scope.go:117] "RemoveContainer" containerID="b1bb0289fe3cf9f334488d1b3bcf9bdf2eee306277df6dddc94b8e949cda0aa2" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.920536 4711 scope.go:117] "RemoveContainer" containerID="5779ae7d2bea7fbfe718d0c56ef8c738eca22f44efd27d43842d75352d2a624a" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.941219 4711 scope.go:117] "RemoveContainer" containerID="e91ed50ef9999e6bd076b05c8b5d470fc5b3367f03774ca54f7dbc615cc73e1a" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.966494 4711 scope.go:117] "RemoveContainer" 
containerID="49bfc8d3a2cef9743231e0a2505435d5711b57028c2cd642ec6dec59ec1e88e4" Dec 03 12:48:46 crc kubenswrapper[4711]: I1203 12:48:46.986597 4711 scope.go:117] "RemoveContainer" containerID="a7514dd12055073cceda3671a612e1900a6e45254c7627587a01cd9d073e7c56" Dec 03 12:48:47 crc kubenswrapper[4711]: I1203 12:48:47.010853 4711 scope.go:117] "RemoveContainer" containerID="ca4d2b57614a2085c63edb6cf6fa28d86e67cbe2daabaa6657b97dd01bd9bdb9" Dec 03 12:48:47 crc kubenswrapper[4711]: I1203 12:48:47.043974 4711 scope.go:117] "RemoveContainer" containerID="275aaa69eb5b557ef1aa2b7e839335f4c14592fbc5d943dbd20fcb2710f0d58a" Dec 03 12:48:47 crc kubenswrapper[4711]: I1203 12:48:47.828007 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02e283b6-2f8f-47e8-b068-7fa6be429de7" path="/var/lib/kubelet/pods/02e283b6-2f8f-47e8-b068-7fa6be429de7/volumes" Dec 03 12:48:47 crc kubenswrapper[4711]: I1203 12:48:47.829379 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba2b9a61-4f13-4911-9977-157cbba6e185" path="/var/lib/kubelet/pods/ba2b9a61-4f13-4911-9977-157cbba6e185/volumes" Dec 03 12:48:47 crc kubenswrapper[4711]: I1203 12:48:47.831178 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6fa5593-a616-432d-a4cc-1b787ffc516a" path="/var/lib/kubelet/pods/c6fa5593-a616-432d-a4cc-1b787ffc516a/volumes" Dec 03 12:48:47 crc kubenswrapper[4711]: I1203 12:48:47.833496 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7f00329-7067-49ec-84fa-cf2ab279955a" path="/var/lib/kubelet/pods/c7f00329-7067-49ec-84fa-cf2ab279955a/volumes" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.125581 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.125943 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" 
podUID="030389c0-f4a6-4f89-9189-b9c685d44387" containerName="glance-log" containerID="cri-o://e303762a6ba2d47b4bff2cd3d0799ec6077278e982fdb7a37fba873320770477" gracePeriod=30 Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.125987 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="030389c0-f4a6-4f89-9189-b9c685d44387" containerName="glance-api" containerID="cri-o://a0b90edd47f253dc9e5095c9ff3a57b7ddc4853f7a5bca2d1a83376302f679e7" gracePeriod=30 Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.126110 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="030389c0-f4a6-4f89-9189-b9c685d44387" containerName="glance-httpd" containerID="cri-o://c875dd1d2275d6ac7a4dc95ae770b5023a4e9789f00f9b1409a0b0a85472a9ac" gracePeriod=30 Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.490245 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.490655 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="7f815083-1f43-4c8d-9285-94b2aa696a09" containerName="glance-log" containerID="cri-o://878741c5d39e3bc5b93a2c8c61b058a09d349ccc01fa40f911726e5152857b52" gracePeriod=30 Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.490757 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="7f815083-1f43-4c8d-9285-94b2aa696a09" containerName="glance-httpd" containerID="cri-o://008c83815d5b196e1144f29523ceb5281c6c112210bb5b3410c529389d6f1bba" gracePeriod=30 Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.490795 4711 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="7f815083-1f43-4c8d-9285-94b2aa696a09" containerName="glance-api" containerID="cri-o://82cb7fa23447af324f3004c01d40611811dd8a1f263b189a0f8264976c458226" gracePeriod=30 Dec 03 12:48:48 crc kubenswrapper[4711]: E1203 12:48:48.782499 4711 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f815083_1f43_4c8d_9285_94b2aa696a09.slice/crio-008c83815d5b196e1144f29523ceb5281c6c112210bb5b3410c529389d6f1bba.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f815083_1f43_4c8d_9285_94b2aa696a09.slice/crio-878741c5d39e3bc5b93a2c8c61b058a09d349ccc01fa40f911726e5152857b52.scope\": RecentStats: unable to find data in memory cache]" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.787922 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.836504 4711 generic.go:334] "Generic (PLEG): container finished" podID="030389c0-f4a6-4f89-9189-b9c685d44387" containerID="a0b90edd47f253dc9e5095c9ff3a57b7ddc4853f7a5bca2d1a83376302f679e7" exitCode=0 Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.836536 4711 generic.go:334] "Generic (PLEG): container finished" podID="030389c0-f4a6-4f89-9189-b9c685d44387" containerID="c875dd1d2275d6ac7a4dc95ae770b5023a4e9789f00f9b1409a0b0a85472a9ac" exitCode=0 Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.836544 4711 generic.go:334] "Generic (PLEG): container finished" podID="030389c0-f4a6-4f89-9189-b9c685d44387" containerID="e303762a6ba2d47b4bff2cd3d0799ec6077278e982fdb7a37fba873320770477" exitCode=143 Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.836596 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.836580 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"030389c0-f4a6-4f89-9189-b9c685d44387","Type":"ContainerDied","Data":"a0b90edd47f253dc9e5095c9ff3a57b7ddc4853f7a5bca2d1a83376302f679e7"} Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.836744 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"030389c0-f4a6-4f89-9189-b9c685d44387","Type":"ContainerDied","Data":"c875dd1d2275d6ac7a4dc95ae770b5023a4e9789f00f9b1409a0b0a85472a9ac"} Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.836770 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"030389c0-f4a6-4f89-9189-b9c685d44387","Type":"ContainerDied","Data":"e303762a6ba2d47b4bff2cd3d0799ec6077278e982fdb7a37fba873320770477"} Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.836782 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"030389c0-f4a6-4f89-9189-b9c685d44387","Type":"ContainerDied","Data":"2429e2effbb987786d0a8ffd276d64151ed760b4b3b254bf9f46c0ee09dea45b"} Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.836801 4711 scope.go:117] "RemoveContainer" containerID="a0b90edd47f253dc9e5095c9ff3a57b7ddc4853f7a5bca2d1a83376302f679e7" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.844062 4711 generic.go:334] "Generic (PLEG): container finished" podID="7f815083-1f43-4c8d-9285-94b2aa696a09" containerID="008c83815d5b196e1144f29523ceb5281c6c112210bb5b3410c529389d6f1bba" exitCode=0 Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.844095 4711 generic.go:334] "Generic (PLEG): container finished" podID="7f815083-1f43-4c8d-9285-94b2aa696a09" 
containerID="878741c5d39e3bc5b93a2c8c61b058a09d349ccc01fa40f911726e5152857b52" exitCode=143 Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.844121 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"7f815083-1f43-4c8d-9285-94b2aa696a09","Type":"ContainerDied","Data":"008c83815d5b196e1144f29523ceb5281c6c112210bb5b3410c529389d6f1bba"} Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.844156 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"7f815083-1f43-4c8d-9285-94b2aa696a09","Type":"ContainerDied","Data":"878741c5d39e3bc5b93a2c8c61b058a09d349ccc01fa40f911726e5152857b52"} Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.863500 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-dev\") pod \"030389c0-f4a6-4f89-9189-b9c685d44387\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.863755 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/030389c0-f4a6-4f89-9189-b9c685d44387-scripts\") pod \"030389c0-f4a6-4f89-9189-b9c685d44387\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.863897 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-run\") pod \"030389c0-f4a6-4f89-9189-b9c685d44387\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.863996 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod 
\"030389c0-f4a6-4f89-9189-b9c685d44387\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.863601 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-dev" (OuterVolumeSpecName: "dev") pod "030389c0-f4a6-4f89-9189-b9c685d44387" (UID: "030389c0-f4a6-4f89-9189-b9c685d44387"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.864014 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-run" (OuterVolumeSpecName: "run") pod "030389c0-f4a6-4f89-9189-b9c685d44387" (UID: "030389c0-f4a6-4f89-9189-b9c685d44387"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.864060 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-etc-nvme\") pod \"030389c0-f4a6-4f89-9189-b9c685d44387\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.864202 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-lib-modules\") pod \"030389c0-f4a6-4f89-9189-b9c685d44387\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.864226 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-etc-iscsi\") pod \"030389c0-f4a6-4f89-9189-b9c685d44387\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 
12:48:48.864249 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-var-locks-brick\") pod \"030389c0-f4a6-4f89-9189-b9c685d44387\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.864280 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"030389c0-f4a6-4f89-9189-b9c685d44387\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.864326 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/030389c0-f4a6-4f89-9189-b9c685d44387-logs\") pod \"030389c0-f4a6-4f89-9189-b9c685d44387\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.864343 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-sys\") pod \"030389c0-f4a6-4f89-9189-b9c685d44387\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.864362 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxwcj\" (UniqueName: \"kubernetes.io/projected/030389c0-f4a6-4f89-9189-b9c685d44387-kube-api-access-mxwcj\") pod \"030389c0-f4a6-4f89-9189-b9c685d44387\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.864386 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/030389c0-f4a6-4f89-9189-b9c685d44387-config-data\") pod \"030389c0-f4a6-4f89-9189-b9c685d44387\" (UID: 
\"030389c0-f4a6-4f89-9189-b9c685d44387\") " Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.864424 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/030389c0-f4a6-4f89-9189-b9c685d44387-httpd-run\") pod \"030389c0-f4a6-4f89-9189-b9c685d44387\" (UID: \"030389c0-f4a6-4f89-9189-b9c685d44387\") " Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.864202 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "030389c0-f4a6-4f89-9189-b9c685d44387" (UID: "030389c0-f4a6-4f89-9189-b9c685d44387"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.864481 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "030389c0-f4a6-4f89-9189-b9c685d44387" (UID: "030389c0-f4a6-4f89-9189-b9c685d44387"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.864602 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "030389c0-f4a6-4f89-9189-b9c685d44387" (UID: "030389c0-f4a6-4f89-9189-b9c685d44387"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.864652 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "030389c0-f4a6-4f89-9189-b9c685d44387" (UID: "030389c0-f4a6-4f89-9189-b9c685d44387"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.864771 4711 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-dev\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.864787 4711 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.864799 4711 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.864810 4711 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.864821 4711 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.864832 4711 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.864976 4711 scope.go:117] "RemoveContainer" containerID="c875dd1d2275d6ac7a4dc95ae770b5023a4e9789f00f9b1409a0b0a85472a9ac" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.865012 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/030389c0-f4a6-4f89-9189-b9c685d44387-logs" (OuterVolumeSpecName: "logs") pod "030389c0-f4a6-4f89-9189-b9c685d44387" (UID: "030389c0-f4a6-4f89-9189-b9c685d44387"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.865040 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-sys" (OuterVolumeSpecName: "sys") pod "030389c0-f4a6-4f89-9189-b9c685d44387" (UID: "030389c0-f4a6-4f89-9189-b9c685d44387"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.865559 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/030389c0-f4a6-4f89-9189-b9c685d44387-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "030389c0-f4a6-4f89-9189-b9c685d44387" (UID: "030389c0-f4a6-4f89-9189-b9c685d44387"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.870073 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "030389c0-f4a6-4f89-9189-b9c685d44387" (UID: "030389c0-f4a6-4f89-9189-b9c685d44387"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.870115 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/030389c0-f4a6-4f89-9189-b9c685d44387-scripts" (OuterVolumeSpecName: "scripts") pod "030389c0-f4a6-4f89-9189-b9c685d44387" (UID: "030389c0-f4a6-4f89-9189-b9c685d44387"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.870091 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance-cache") pod "030389c0-f4a6-4f89-9189-b9c685d44387" (UID: "030389c0-f4a6-4f89-9189-b9c685d44387"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.870556 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/030389c0-f4a6-4f89-9189-b9c685d44387-kube-api-access-mxwcj" (OuterVolumeSpecName: "kube-api-access-mxwcj") pod "030389c0-f4a6-4f89-9189-b9c685d44387" (UID: "030389c0-f4a6-4f89-9189-b9c685d44387"). InnerVolumeSpecName "kube-api-access-mxwcj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.886844 4711 scope.go:117] "RemoveContainer" containerID="e303762a6ba2d47b4bff2cd3d0799ec6077278e982fdb7a37fba873320770477" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.924403 4711 scope.go:117] "RemoveContainer" containerID="a0b90edd47f253dc9e5095c9ff3a57b7ddc4853f7a5bca2d1a83376302f679e7" Dec 03 12:48:48 crc kubenswrapper[4711]: E1203 12:48:48.924770 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0b90edd47f253dc9e5095c9ff3a57b7ddc4853f7a5bca2d1a83376302f679e7\": container with ID starting with a0b90edd47f253dc9e5095c9ff3a57b7ddc4853f7a5bca2d1a83376302f679e7 not found: ID does not exist" containerID="a0b90edd47f253dc9e5095c9ff3a57b7ddc4853f7a5bca2d1a83376302f679e7" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.924802 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0b90edd47f253dc9e5095c9ff3a57b7ddc4853f7a5bca2d1a83376302f679e7"} err="failed to get container status \"a0b90edd47f253dc9e5095c9ff3a57b7ddc4853f7a5bca2d1a83376302f679e7\": rpc error: code = NotFound desc = could not find container \"a0b90edd47f253dc9e5095c9ff3a57b7ddc4853f7a5bca2d1a83376302f679e7\": container with ID starting with a0b90edd47f253dc9e5095c9ff3a57b7ddc4853f7a5bca2d1a83376302f679e7 not found: ID does not exist" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.924827 4711 scope.go:117] "RemoveContainer" containerID="c875dd1d2275d6ac7a4dc95ae770b5023a4e9789f00f9b1409a0b0a85472a9ac" Dec 03 12:48:48 crc kubenswrapper[4711]: E1203 12:48:48.925278 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c875dd1d2275d6ac7a4dc95ae770b5023a4e9789f00f9b1409a0b0a85472a9ac\": container with ID starting with 
c875dd1d2275d6ac7a4dc95ae770b5023a4e9789f00f9b1409a0b0a85472a9ac not found: ID does not exist" containerID="c875dd1d2275d6ac7a4dc95ae770b5023a4e9789f00f9b1409a0b0a85472a9ac" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.925329 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c875dd1d2275d6ac7a4dc95ae770b5023a4e9789f00f9b1409a0b0a85472a9ac"} err="failed to get container status \"c875dd1d2275d6ac7a4dc95ae770b5023a4e9789f00f9b1409a0b0a85472a9ac\": rpc error: code = NotFound desc = could not find container \"c875dd1d2275d6ac7a4dc95ae770b5023a4e9789f00f9b1409a0b0a85472a9ac\": container with ID starting with c875dd1d2275d6ac7a4dc95ae770b5023a4e9789f00f9b1409a0b0a85472a9ac not found: ID does not exist" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.925356 4711 scope.go:117] "RemoveContainer" containerID="e303762a6ba2d47b4bff2cd3d0799ec6077278e982fdb7a37fba873320770477" Dec 03 12:48:48 crc kubenswrapper[4711]: E1203 12:48:48.925677 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e303762a6ba2d47b4bff2cd3d0799ec6077278e982fdb7a37fba873320770477\": container with ID starting with e303762a6ba2d47b4bff2cd3d0799ec6077278e982fdb7a37fba873320770477 not found: ID does not exist" containerID="e303762a6ba2d47b4bff2cd3d0799ec6077278e982fdb7a37fba873320770477" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.925709 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e303762a6ba2d47b4bff2cd3d0799ec6077278e982fdb7a37fba873320770477"} err="failed to get container status \"e303762a6ba2d47b4bff2cd3d0799ec6077278e982fdb7a37fba873320770477\": rpc error: code = NotFound desc = could not find container \"e303762a6ba2d47b4bff2cd3d0799ec6077278e982fdb7a37fba873320770477\": container with ID starting with e303762a6ba2d47b4bff2cd3d0799ec6077278e982fdb7a37fba873320770477 not found: ID does not 
exist" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.925724 4711 scope.go:117] "RemoveContainer" containerID="a0b90edd47f253dc9e5095c9ff3a57b7ddc4853f7a5bca2d1a83376302f679e7" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.926043 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0b90edd47f253dc9e5095c9ff3a57b7ddc4853f7a5bca2d1a83376302f679e7"} err="failed to get container status \"a0b90edd47f253dc9e5095c9ff3a57b7ddc4853f7a5bca2d1a83376302f679e7\": rpc error: code = NotFound desc = could not find container \"a0b90edd47f253dc9e5095c9ff3a57b7ddc4853f7a5bca2d1a83376302f679e7\": container with ID starting with a0b90edd47f253dc9e5095c9ff3a57b7ddc4853f7a5bca2d1a83376302f679e7 not found: ID does not exist" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.926138 4711 scope.go:117] "RemoveContainer" containerID="c875dd1d2275d6ac7a4dc95ae770b5023a4e9789f00f9b1409a0b0a85472a9ac" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.926486 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c875dd1d2275d6ac7a4dc95ae770b5023a4e9789f00f9b1409a0b0a85472a9ac"} err="failed to get container status \"c875dd1d2275d6ac7a4dc95ae770b5023a4e9789f00f9b1409a0b0a85472a9ac\": rpc error: code = NotFound desc = could not find container \"c875dd1d2275d6ac7a4dc95ae770b5023a4e9789f00f9b1409a0b0a85472a9ac\": container with ID starting with c875dd1d2275d6ac7a4dc95ae770b5023a4e9789f00f9b1409a0b0a85472a9ac not found: ID does not exist" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.926512 4711 scope.go:117] "RemoveContainer" containerID="e303762a6ba2d47b4bff2cd3d0799ec6077278e982fdb7a37fba873320770477" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.926716 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e303762a6ba2d47b4bff2cd3d0799ec6077278e982fdb7a37fba873320770477"} err="failed to get container status 
\"e303762a6ba2d47b4bff2cd3d0799ec6077278e982fdb7a37fba873320770477\": rpc error: code = NotFound desc = could not find container \"e303762a6ba2d47b4bff2cd3d0799ec6077278e982fdb7a37fba873320770477\": container with ID starting with e303762a6ba2d47b4bff2cd3d0799ec6077278e982fdb7a37fba873320770477 not found: ID does not exist" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.926749 4711 scope.go:117] "RemoveContainer" containerID="a0b90edd47f253dc9e5095c9ff3a57b7ddc4853f7a5bca2d1a83376302f679e7" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.927041 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0b90edd47f253dc9e5095c9ff3a57b7ddc4853f7a5bca2d1a83376302f679e7"} err="failed to get container status \"a0b90edd47f253dc9e5095c9ff3a57b7ddc4853f7a5bca2d1a83376302f679e7\": rpc error: code = NotFound desc = could not find container \"a0b90edd47f253dc9e5095c9ff3a57b7ddc4853f7a5bca2d1a83376302f679e7\": container with ID starting with a0b90edd47f253dc9e5095c9ff3a57b7ddc4853f7a5bca2d1a83376302f679e7 not found: ID does not exist" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.927062 4711 scope.go:117] "RemoveContainer" containerID="c875dd1d2275d6ac7a4dc95ae770b5023a4e9789f00f9b1409a0b0a85472a9ac" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.927266 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c875dd1d2275d6ac7a4dc95ae770b5023a4e9789f00f9b1409a0b0a85472a9ac"} err="failed to get container status \"c875dd1d2275d6ac7a4dc95ae770b5023a4e9789f00f9b1409a0b0a85472a9ac\": rpc error: code = NotFound desc = could not find container \"c875dd1d2275d6ac7a4dc95ae770b5023a4e9789f00f9b1409a0b0a85472a9ac\": container with ID starting with c875dd1d2275d6ac7a4dc95ae770b5023a4e9789f00f9b1409a0b0a85472a9ac not found: ID does not exist" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.927285 4711 scope.go:117] "RemoveContainer" 
containerID="e303762a6ba2d47b4bff2cd3d0799ec6077278e982fdb7a37fba873320770477" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.927505 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e303762a6ba2d47b4bff2cd3d0799ec6077278e982fdb7a37fba873320770477"} err="failed to get container status \"e303762a6ba2d47b4bff2cd3d0799ec6077278e982fdb7a37fba873320770477\": rpc error: code = NotFound desc = could not find container \"e303762a6ba2d47b4bff2cd3d0799ec6077278e982fdb7a37fba873320770477\": container with ID starting with e303762a6ba2d47b4bff2cd3d0799ec6077278e982fdb7a37fba873320770477 not found: ID does not exist" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.941313 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/030389c0-f4a6-4f89-9189-b9c685d44387-config-data" (OuterVolumeSpecName: "config-data") pod "030389c0-f4a6-4f89-9189-b9c685d44387" (UID: "030389c0-f4a6-4f89-9189-b9c685d44387"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.966506 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.966554 4711 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/030389c0-f4a6-4f89-9189-b9c685d44387-logs\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.966569 4711 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/030389c0-f4a6-4f89-9189-b9c685d44387-sys\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.966580 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxwcj\" (UniqueName: \"kubernetes.io/projected/030389c0-f4a6-4f89-9189-b9c685d44387-kube-api-access-mxwcj\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.966596 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/030389c0-f4a6-4f89-9189-b9c685d44387-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.966606 4711 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/030389c0-f4a6-4f89-9189-b9c685d44387-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.966615 4711 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/030389c0-f4a6-4f89-9189-b9c685d44387-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.966631 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume 
\"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.981271 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 03 12:48:48 crc kubenswrapper[4711]: I1203 12:48:48.983024 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.068580 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.068623 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.136564 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.169571 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8pt6\" (UniqueName: \"kubernetes.io/projected/7f815083-1f43-4c8d-9285-94b2aa696a09-kube-api-access-l8pt6\") pod \"7f815083-1f43-4c8d-9285-94b2aa696a09\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.169612 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f815083-1f43-4c8d-9285-94b2aa696a09-scripts\") pod \"7f815083-1f43-4c8d-9285-94b2aa696a09\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.169668 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-dev\") pod \"7f815083-1f43-4c8d-9285-94b2aa696a09\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.170000 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-dev" (OuterVolumeSpecName: "dev") pod "7f815083-1f43-4c8d-9285-94b2aa696a09" (UID: "7f815083-1f43-4c8d-9285-94b2aa696a09"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.170193 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-sys\") pod \"7f815083-1f43-4c8d-9285-94b2aa696a09\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.170251 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-etc-iscsi\") pod \"7f815083-1f43-4c8d-9285-94b2aa696a09\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.170286 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f815083-1f43-4c8d-9285-94b2aa696a09-logs\") pod \"7f815083-1f43-4c8d-9285-94b2aa696a09\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.170315 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-lib-modules\") pod \"7f815083-1f43-4c8d-9285-94b2aa696a09\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.170349 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-var-locks-brick\") pod \"7f815083-1f43-4c8d-9285-94b2aa696a09\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.170415 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod 
\"7f815083-1f43-4c8d-9285-94b2aa696a09\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.170436 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-run\") pod \"7f815083-1f43-4c8d-9285-94b2aa696a09\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.170455 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-etc-nvme\") pod \"7f815083-1f43-4c8d-9285-94b2aa696a09\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.170533 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"7f815083-1f43-4c8d-9285-94b2aa696a09\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.170555 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f815083-1f43-4c8d-9285-94b2aa696a09-config-data\") pod \"7f815083-1f43-4c8d-9285-94b2aa696a09\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.170586 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7f815083-1f43-4c8d-9285-94b2aa696a09-httpd-run\") pod \"7f815083-1f43-4c8d-9285-94b2aa696a09\" (UID: \"7f815083-1f43-4c8d-9285-94b2aa696a09\") " Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.171217 4711 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-dev\") on node 
\"crc\" DevicePath \"\"" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.171532 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f815083-1f43-4c8d-9285-94b2aa696a09-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7f815083-1f43-4c8d-9285-94b2aa696a09" (UID: "7f815083-1f43-4c8d-9285-94b2aa696a09"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.171605 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-sys" (OuterVolumeSpecName: "sys") pod "7f815083-1f43-4c8d-9285-94b2aa696a09" (UID: "7f815083-1f43-4c8d-9285-94b2aa696a09"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.171634 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "7f815083-1f43-4c8d-9285-94b2aa696a09" (UID: "7f815083-1f43-4c8d-9285-94b2aa696a09"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.172065 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f815083-1f43-4c8d-9285-94b2aa696a09-logs" (OuterVolumeSpecName: "logs") pod "7f815083-1f43-4c8d-9285-94b2aa696a09" (UID: "7f815083-1f43-4c8d-9285-94b2aa696a09"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.172103 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "7f815083-1f43-4c8d-9285-94b2aa696a09" (UID: "7f815083-1f43-4c8d-9285-94b2aa696a09"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.172158 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "7f815083-1f43-4c8d-9285-94b2aa696a09" (UID: "7f815083-1f43-4c8d-9285-94b2aa696a09"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.172439 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "7f815083-1f43-4c8d-9285-94b2aa696a09" (UID: "7f815083-1f43-4c8d-9285-94b2aa696a09"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.172601 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-run" (OuterVolumeSpecName: "run") pod "7f815083-1f43-4c8d-9285-94b2aa696a09" (UID: "7f815083-1f43-4c8d-9285-94b2aa696a09"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.173219 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f815083-1f43-4c8d-9285-94b2aa696a09-scripts" (OuterVolumeSpecName: "scripts") pod "7f815083-1f43-4c8d-9285-94b2aa696a09" (UID: "7f815083-1f43-4c8d-9285-94b2aa696a09"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.177894 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "7f815083-1f43-4c8d-9285-94b2aa696a09" (UID: "7f815083-1f43-4c8d-9285-94b2aa696a09"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.178016 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance-cache") pod "7f815083-1f43-4c8d-9285-94b2aa696a09" (UID: "7f815083-1f43-4c8d-9285-94b2aa696a09"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.181249 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f815083-1f43-4c8d-9285-94b2aa696a09-kube-api-access-l8pt6" (OuterVolumeSpecName: "kube-api-access-l8pt6") pod "7f815083-1f43-4c8d-9285-94b2aa696a09" (UID: "7f815083-1f43-4c8d-9285-94b2aa696a09"). InnerVolumeSpecName "kube-api-access-l8pt6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.213341 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.223981 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.259557 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f815083-1f43-4c8d-9285-94b2aa696a09-config-data" (OuterVolumeSpecName: "config-data") pod "7f815083-1f43-4c8d-9285-94b2aa696a09" (UID: "7f815083-1f43-4c8d-9285-94b2aa696a09"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.272800 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8pt6\" (UniqueName: \"kubernetes.io/projected/7f815083-1f43-4c8d-9285-94b2aa696a09-kube-api-access-l8pt6\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.272825 4711 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f815083-1f43-4c8d-9285-94b2aa696a09-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.272835 4711 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-sys\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.272846 4711 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.272854 4711 reconciler_common.go:293] "Volume detached for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/7f815083-1f43-4c8d-9285-94b2aa696a09-logs\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.272862 4711 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.272870 4711 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.272896 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.272939 4711 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.272951 4711 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7f815083-1f43-4c8d-9285-94b2aa696a09-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.272968 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.272977 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f815083-1f43-4c8d-9285-94b2aa696a09-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.272988 4711 
reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7f815083-1f43-4c8d-9285-94b2aa696a09-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.286795 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.287969 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.374654 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.374944 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.827709 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="030389c0-f4a6-4f89-9189-b9c685d44387" path="/var/lib/kubelet/pods/030389c0-f4a6-4f89-9189-b9c685d44387/volumes" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.856942 4711 generic.go:334] "Generic (PLEG): container finished" podID="7f815083-1f43-4c8d-9285-94b2aa696a09" containerID="82cb7fa23447af324f3004c01d40611811dd8a1f263b189a0f8264976c458226" exitCode=0 Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.856987 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"7f815083-1f43-4c8d-9285-94b2aa696a09","Type":"ContainerDied","Data":"82cb7fa23447af324f3004c01d40611811dd8a1f263b189a0f8264976c458226"} Dec 03 12:48:49 crc 
kubenswrapper[4711]: I1203 12:48:49.857018 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"7f815083-1f43-4c8d-9285-94b2aa696a09","Type":"ContainerDied","Data":"ae33bd94df8af2e2d8b570e425544c1be2eb9822b55fd5862ab58a833dc90587"} Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.857039 4711 scope.go:117] "RemoveContainer" containerID="82cb7fa23447af324f3004c01d40611811dd8a1f263b189a0f8264976c458226" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.857225 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.879990 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.884498 4711 scope.go:117] "RemoveContainer" containerID="008c83815d5b196e1144f29523ceb5281c6c112210bb5b3410c529389d6f1bba" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.892190 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.912378 4711 scope.go:117] "RemoveContainer" containerID="878741c5d39e3bc5b93a2c8c61b058a09d349ccc01fa40f911726e5152857b52" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.935054 4711 scope.go:117] "RemoveContainer" containerID="82cb7fa23447af324f3004c01d40611811dd8a1f263b189a0f8264976c458226" Dec 03 12:48:49 crc kubenswrapper[4711]: E1203 12:48:49.935725 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82cb7fa23447af324f3004c01d40611811dd8a1f263b189a0f8264976c458226\": container with ID starting with 82cb7fa23447af324f3004c01d40611811dd8a1f263b189a0f8264976c458226 not found: ID does not exist" 
containerID="82cb7fa23447af324f3004c01d40611811dd8a1f263b189a0f8264976c458226" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.935794 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82cb7fa23447af324f3004c01d40611811dd8a1f263b189a0f8264976c458226"} err="failed to get container status \"82cb7fa23447af324f3004c01d40611811dd8a1f263b189a0f8264976c458226\": rpc error: code = NotFound desc = could not find container \"82cb7fa23447af324f3004c01d40611811dd8a1f263b189a0f8264976c458226\": container with ID starting with 82cb7fa23447af324f3004c01d40611811dd8a1f263b189a0f8264976c458226 not found: ID does not exist" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.935838 4711 scope.go:117] "RemoveContainer" containerID="008c83815d5b196e1144f29523ceb5281c6c112210bb5b3410c529389d6f1bba" Dec 03 12:48:49 crc kubenswrapper[4711]: E1203 12:48:49.936407 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"008c83815d5b196e1144f29523ceb5281c6c112210bb5b3410c529389d6f1bba\": container with ID starting with 008c83815d5b196e1144f29523ceb5281c6c112210bb5b3410c529389d6f1bba not found: ID does not exist" containerID="008c83815d5b196e1144f29523ceb5281c6c112210bb5b3410c529389d6f1bba" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.936451 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"008c83815d5b196e1144f29523ceb5281c6c112210bb5b3410c529389d6f1bba"} err="failed to get container status \"008c83815d5b196e1144f29523ceb5281c6c112210bb5b3410c529389d6f1bba\": rpc error: code = NotFound desc = could not find container \"008c83815d5b196e1144f29523ceb5281c6c112210bb5b3410c529389d6f1bba\": container with ID starting with 008c83815d5b196e1144f29523ceb5281c6c112210bb5b3410c529389d6f1bba not found: ID does not exist" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.936478 4711 scope.go:117] 
"RemoveContainer" containerID="878741c5d39e3bc5b93a2c8c61b058a09d349ccc01fa40f911726e5152857b52" Dec 03 12:48:49 crc kubenswrapper[4711]: E1203 12:48:49.937046 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"878741c5d39e3bc5b93a2c8c61b058a09d349ccc01fa40f911726e5152857b52\": container with ID starting with 878741c5d39e3bc5b93a2c8c61b058a09d349ccc01fa40f911726e5152857b52 not found: ID does not exist" containerID="878741c5d39e3bc5b93a2c8c61b058a09d349ccc01fa40f911726e5152857b52" Dec 03 12:48:49 crc kubenswrapper[4711]: I1203 12:48:49.937080 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"878741c5d39e3bc5b93a2c8c61b058a09d349ccc01fa40f911726e5152857b52"} err="failed to get container status \"878741c5d39e3bc5b93a2c8c61b058a09d349ccc01fa40f911726e5152857b52\": rpc error: code = NotFound desc = could not find container \"878741c5d39e3bc5b93a2c8c61b058a09d349ccc01fa40f911726e5152857b52\": container with ID starting with 878741c5d39e3bc5b93a2c8c61b058a09d349ccc01fa40f911726e5152857b52 not found: ID does not exist" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.706763 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-rhspb"] Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.727188 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-rhspb"] Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.751834 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glancea0cd-account-delete-bhckz"] Dec 03 12:48:50 crc kubenswrapper[4711]: E1203 12:48:50.752158 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="030389c0-f4a6-4f89-9189-b9c685d44387" containerName="glance-log" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752179 4711 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="030389c0-f4a6-4f89-9189-b9c685d44387" containerName="glance-log" Dec 03 12:48:50 crc kubenswrapper[4711]: E1203 12:48:50.752214 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba2b9a61-4f13-4911-9977-157cbba6e185" containerName="glance-api" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752224 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba2b9a61-4f13-4911-9977-157cbba6e185" containerName="glance-api" Dec 03 12:48:50 crc kubenswrapper[4711]: E1203 12:48:50.752245 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba2b9a61-4f13-4911-9977-157cbba6e185" containerName="glance-log" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752253 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba2b9a61-4f13-4911-9977-157cbba6e185" containerName="glance-log" Dec 03 12:48:50 crc kubenswrapper[4711]: E1203 12:48:50.752270 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba2b9a61-4f13-4911-9977-157cbba6e185" containerName="glance-httpd" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752279 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba2b9a61-4f13-4911-9977-157cbba6e185" containerName="glance-httpd" Dec 03 12:48:50 crc kubenswrapper[4711]: E1203 12:48:50.752290 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6fa5593-a616-432d-a4cc-1b787ffc516a" containerName="glance-api" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752297 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6fa5593-a616-432d-a4cc-1b787ffc516a" containerName="glance-api" Dec 03 12:48:50 crc kubenswrapper[4711]: E1203 12:48:50.752316 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="030389c0-f4a6-4f89-9189-b9c685d44387" containerName="glance-httpd" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752323 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="030389c0-f4a6-4f89-9189-b9c685d44387" 
containerName="glance-httpd" Dec 03 12:48:50 crc kubenswrapper[4711]: E1203 12:48:50.752338 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02e283b6-2f8f-47e8-b068-7fa6be429de7" containerName="glance-log" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752345 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="02e283b6-2f8f-47e8-b068-7fa6be429de7" containerName="glance-log" Dec 03 12:48:50 crc kubenswrapper[4711]: E1203 12:48:50.752363 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="030389c0-f4a6-4f89-9189-b9c685d44387" containerName="glance-api" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752370 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="030389c0-f4a6-4f89-9189-b9c685d44387" containerName="glance-api" Dec 03 12:48:50 crc kubenswrapper[4711]: E1203 12:48:50.752386 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f815083-1f43-4c8d-9285-94b2aa696a09" containerName="glance-api" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752395 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f815083-1f43-4c8d-9285-94b2aa696a09" containerName="glance-api" Dec 03 12:48:50 crc kubenswrapper[4711]: E1203 12:48:50.752408 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02e283b6-2f8f-47e8-b068-7fa6be429de7" containerName="glance-httpd" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752417 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="02e283b6-2f8f-47e8-b068-7fa6be429de7" containerName="glance-httpd" Dec 03 12:48:50 crc kubenswrapper[4711]: E1203 12:48:50.752429 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7f00329-7067-49ec-84fa-cf2ab279955a" containerName="glance-api" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752438 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7f00329-7067-49ec-84fa-cf2ab279955a" containerName="glance-api" Dec 03 12:48:50 crc 
kubenswrapper[4711]: E1203 12:48:50.752453 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6fa5593-a616-432d-a4cc-1b787ffc516a" containerName="glance-httpd" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752460 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6fa5593-a616-432d-a4cc-1b787ffc516a" containerName="glance-httpd" Dec 03 12:48:50 crc kubenswrapper[4711]: E1203 12:48:50.752471 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f815083-1f43-4c8d-9285-94b2aa696a09" containerName="glance-log" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752478 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f815083-1f43-4c8d-9285-94b2aa696a09" containerName="glance-log" Dec 03 12:48:50 crc kubenswrapper[4711]: E1203 12:48:50.752491 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6fa5593-a616-432d-a4cc-1b787ffc516a" containerName="glance-log" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752499 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6fa5593-a616-432d-a4cc-1b787ffc516a" containerName="glance-log" Dec 03 12:48:50 crc kubenswrapper[4711]: E1203 12:48:50.752509 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7f00329-7067-49ec-84fa-cf2ab279955a" containerName="glance-httpd" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752517 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7f00329-7067-49ec-84fa-cf2ab279955a" containerName="glance-httpd" Dec 03 12:48:50 crc kubenswrapper[4711]: E1203 12:48:50.752531 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7f00329-7067-49ec-84fa-cf2ab279955a" containerName="glance-log" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752539 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7f00329-7067-49ec-84fa-cf2ab279955a" containerName="glance-log" Dec 03 12:48:50 crc kubenswrapper[4711]: E1203 12:48:50.752552 4711 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02e283b6-2f8f-47e8-b068-7fa6be429de7" containerName="glance-api" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752560 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="02e283b6-2f8f-47e8-b068-7fa6be429de7" containerName="glance-api" Dec 03 12:48:50 crc kubenswrapper[4711]: E1203 12:48:50.752572 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f815083-1f43-4c8d-9285-94b2aa696a09" containerName="glance-httpd" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752579 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f815083-1f43-4c8d-9285-94b2aa696a09" containerName="glance-httpd" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752723 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6fa5593-a616-432d-a4cc-1b787ffc516a" containerName="glance-api" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752741 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f815083-1f43-4c8d-9285-94b2aa696a09" containerName="glance-httpd" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752753 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba2b9a61-4f13-4911-9977-157cbba6e185" containerName="glance-api" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752763 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7f00329-7067-49ec-84fa-cf2ab279955a" containerName="glance-httpd" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752772 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="030389c0-f4a6-4f89-9189-b9c685d44387" containerName="glance-httpd" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752783 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="02e283b6-2f8f-47e8-b068-7fa6be429de7" containerName="glance-log" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752796 4711 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="02e283b6-2f8f-47e8-b068-7fa6be429de7" containerName="glance-api" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752808 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6fa5593-a616-432d-a4cc-1b787ffc516a" containerName="glance-httpd" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752821 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba2b9a61-4f13-4911-9977-157cbba6e185" containerName="glance-httpd" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752829 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba2b9a61-4f13-4911-9977-157cbba6e185" containerName="glance-log" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752838 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6fa5593-a616-432d-a4cc-1b787ffc516a" containerName="glance-log" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752846 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="02e283b6-2f8f-47e8-b068-7fa6be429de7" containerName="glance-httpd" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752857 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7f00329-7067-49ec-84fa-cf2ab279955a" containerName="glance-api" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752870 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f815083-1f43-4c8d-9285-94b2aa696a09" containerName="glance-log" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752879 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="030389c0-f4a6-4f89-9189-b9c685d44387" containerName="glance-api" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752893 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7f00329-7067-49ec-84fa-cf2ab279955a" containerName="glance-log" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752904 4711 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="030389c0-f4a6-4f89-9189-b9c685d44387" containerName="glance-log" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.752976 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f815083-1f43-4c8d-9285-94b2aa696a09" containerName="glance-api" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.753495 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glancea0cd-account-delete-bhckz" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.758686 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glancea0cd-account-delete-bhckz"] Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.796417 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbmgs\" (UniqueName: \"kubernetes.io/projected/5f5d825d-4bc4-4387-bcc7-3166a7524a8b-kube-api-access-bbmgs\") pod \"glancea0cd-account-delete-bhckz\" (UID: \"5f5d825d-4bc4-4387-bcc7-3166a7524a8b\") " pod="glance-kuttl-tests/glancea0cd-account-delete-bhckz" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.796557 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f5d825d-4bc4-4387-bcc7-3166a7524a8b-operator-scripts\") pod \"glancea0cd-account-delete-bhckz\" (UID: \"5f5d825d-4bc4-4387-bcc7-3166a7524a8b\") " pod="glance-kuttl-tests/glancea0cd-account-delete-bhckz" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.897654 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbmgs\" (UniqueName: \"kubernetes.io/projected/5f5d825d-4bc4-4387-bcc7-3166a7524a8b-kube-api-access-bbmgs\") pod \"glancea0cd-account-delete-bhckz\" (UID: \"5f5d825d-4bc4-4387-bcc7-3166a7524a8b\") " pod="glance-kuttl-tests/glancea0cd-account-delete-bhckz" Dec 03 12:48:50 crc kubenswrapper[4711]: 
I1203 12:48:50.897759 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f5d825d-4bc4-4387-bcc7-3166a7524a8b-operator-scripts\") pod \"glancea0cd-account-delete-bhckz\" (UID: \"5f5d825d-4bc4-4387-bcc7-3166a7524a8b\") " pod="glance-kuttl-tests/glancea0cd-account-delete-bhckz" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.898524 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f5d825d-4bc4-4387-bcc7-3166a7524a8b-operator-scripts\") pod \"glancea0cd-account-delete-bhckz\" (UID: \"5f5d825d-4bc4-4387-bcc7-3166a7524a8b\") " pod="glance-kuttl-tests/glancea0cd-account-delete-bhckz" Dec 03 12:48:50 crc kubenswrapper[4711]: I1203 12:48:50.917191 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbmgs\" (UniqueName: \"kubernetes.io/projected/5f5d825d-4bc4-4387-bcc7-3166a7524a8b-kube-api-access-bbmgs\") pod \"glancea0cd-account-delete-bhckz\" (UID: \"5f5d825d-4bc4-4387-bcc7-3166a7524a8b\") " pod="glance-kuttl-tests/glancea0cd-account-delete-bhckz" Dec 03 12:48:51 crc kubenswrapper[4711]: I1203 12:48:51.067245 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glancea0cd-account-delete-bhckz" Dec 03 12:48:51 crc kubenswrapper[4711]: I1203 12:48:51.415978 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glancea0cd-account-delete-bhckz"] Dec 03 12:48:51 crc kubenswrapper[4711]: W1203 12:48:51.432758 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f5d825d_4bc4_4387_bcc7_3166a7524a8b.slice/crio-4ff3db7b49e196e0cdae0959879011d05cc53ce6e17fde64d9d0c15b49ad9ec2 WatchSource:0}: Error finding container 4ff3db7b49e196e0cdae0959879011d05cc53ce6e17fde64d9d0c15b49ad9ec2: Status 404 returned error can't find the container with id 4ff3db7b49e196e0cdae0959879011d05cc53ce6e17fde64d9d0c15b49ad9ec2 Dec 03 12:48:51 crc kubenswrapper[4711]: I1203 12:48:51.825975 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f815083-1f43-4c8d-9285-94b2aa696a09" path="/var/lib/kubelet/pods/7f815083-1f43-4c8d-9285-94b2aa696a09/volumes" Dec 03 12:48:51 crc kubenswrapper[4711]: I1203 12:48:51.826958 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e20e724b-0cd7-4c5a-813d-d814e2655a03" path="/var/lib/kubelet/pods/e20e724b-0cd7-4c5a-813d-d814e2655a03/volumes" Dec 03 12:48:51 crc kubenswrapper[4711]: I1203 12:48:51.873260 4711 generic.go:334] "Generic (PLEG): container finished" podID="5f5d825d-4bc4-4387-bcc7-3166a7524a8b" containerID="bc7026f116c090900df51fc64f1df0aebfc142a0937d53ed5eeb39d2fc4ace8c" exitCode=0 Dec 03 12:48:51 crc kubenswrapper[4711]: I1203 12:48:51.873297 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancea0cd-account-delete-bhckz" event={"ID":"5f5d825d-4bc4-4387-bcc7-3166a7524a8b","Type":"ContainerDied","Data":"bc7026f116c090900df51fc64f1df0aebfc142a0937d53ed5eeb39d2fc4ace8c"} Dec 03 12:48:51 crc kubenswrapper[4711]: I1203 12:48:51.873324 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glancea0cd-account-delete-bhckz" event={"ID":"5f5d825d-4bc4-4387-bcc7-3166a7524a8b","Type":"ContainerStarted","Data":"4ff3db7b49e196e0cdae0959879011d05cc53ce6e17fde64d9d0c15b49ad9ec2"} Dec 03 12:48:53 crc kubenswrapper[4711]: I1203 12:48:53.177575 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glancea0cd-account-delete-bhckz" Dec 03 12:48:53 crc kubenswrapper[4711]: I1203 12:48:53.234598 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f5d825d-4bc4-4387-bcc7-3166a7524a8b-operator-scripts\") pod \"5f5d825d-4bc4-4387-bcc7-3166a7524a8b\" (UID: \"5f5d825d-4bc4-4387-bcc7-3166a7524a8b\") " Dec 03 12:48:53 crc kubenswrapper[4711]: I1203 12:48:53.234652 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbmgs\" (UniqueName: \"kubernetes.io/projected/5f5d825d-4bc4-4387-bcc7-3166a7524a8b-kube-api-access-bbmgs\") pod \"5f5d825d-4bc4-4387-bcc7-3166a7524a8b\" (UID: \"5f5d825d-4bc4-4387-bcc7-3166a7524a8b\") " Dec 03 12:48:53 crc kubenswrapper[4711]: I1203 12:48:53.235335 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f5d825d-4bc4-4387-bcc7-3166a7524a8b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5f5d825d-4bc4-4387-bcc7-3166a7524a8b" (UID: "5f5d825d-4bc4-4387-bcc7-3166a7524a8b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:48:53 crc kubenswrapper[4711]: I1203 12:48:53.241307 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f5d825d-4bc4-4387-bcc7-3166a7524a8b-kube-api-access-bbmgs" (OuterVolumeSpecName: "kube-api-access-bbmgs") pod "5f5d825d-4bc4-4387-bcc7-3166a7524a8b" (UID: "5f5d825d-4bc4-4387-bcc7-3166a7524a8b"). InnerVolumeSpecName "kube-api-access-bbmgs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:48:53 crc kubenswrapper[4711]: I1203 12:48:53.336585 4711 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f5d825d-4bc4-4387-bcc7-3166a7524a8b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:53 crc kubenswrapper[4711]: I1203 12:48:53.336620 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbmgs\" (UniqueName: \"kubernetes.io/projected/5f5d825d-4bc4-4387-bcc7-3166a7524a8b-kube-api-access-bbmgs\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:53 crc kubenswrapper[4711]: I1203 12:48:53.888580 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancea0cd-account-delete-bhckz" event={"ID":"5f5d825d-4bc4-4387-bcc7-3166a7524a8b","Type":"ContainerDied","Data":"4ff3db7b49e196e0cdae0959879011d05cc53ce6e17fde64d9d0c15b49ad9ec2"} Dec 03 12:48:53 crc kubenswrapper[4711]: I1203 12:48:53.888615 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glancea0cd-account-delete-bhckz" Dec 03 12:48:53 crc kubenswrapper[4711]: I1203 12:48:53.888631 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ff3db7b49e196e0cdae0959879011d05cc53ce6e17fde64d9d0c15b49ad9ec2" Dec 03 12:48:55 crc kubenswrapper[4711]: I1203 12:48:55.783067 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-kv5n2"] Dec 03 12:48:55 crc kubenswrapper[4711]: I1203 12:48:55.791096 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-a0cd-account-create-update-wn2w2"] Dec 03 12:48:55 crc kubenswrapper[4711]: I1203 12:48:55.798175 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-a0cd-account-create-update-wn2w2"] Dec 03 12:48:55 crc kubenswrapper[4711]: I1203 12:48:55.803553 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glancea0cd-account-delete-bhckz"] Dec 03 12:48:55 crc kubenswrapper[4711]: I1203 12:48:55.808668 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-kv5n2"] Dec 03 12:48:55 crc kubenswrapper[4711]: I1203 12:48:55.813737 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glancea0cd-account-delete-bhckz"] Dec 03 12:48:55 crc kubenswrapper[4711]: I1203 12:48:55.826480 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f5d825d-4bc4-4387-bcc7-3166a7524a8b" path="/var/lib/kubelet/pods/5f5d825d-4bc4-4387-bcc7-3166a7524a8b/volumes" Dec 03 12:48:55 crc kubenswrapper[4711]: I1203 12:48:55.827604 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="867531ae-9b99-4428-a807-54069dfb9da9" path="/var/lib/kubelet/pods/867531ae-9b99-4428-a807-54069dfb9da9/volumes" Dec 03 12:48:55 crc kubenswrapper[4711]: I1203 12:48:55.828413 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="ea294a21-cc69-4aa6-aa5e-51b4f7c86b68" path="/var/lib/kubelet/pods/ea294a21-cc69-4aa6-aa5e-51b4f7c86b68/volumes" Dec 03 12:48:56 crc kubenswrapper[4711]: I1203 12:48:56.654974 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-ks97b"] Dec 03 12:48:56 crc kubenswrapper[4711]: E1203 12:48:56.655393 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f5d825d-4bc4-4387-bcc7-3166a7524a8b" containerName="mariadb-account-delete" Dec 03 12:48:56 crc kubenswrapper[4711]: I1203 12:48:56.655417 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f5d825d-4bc4-4387-bcc7-3166a7524a8b" containerName="mariadb-account-delete" Dec 03 12:48:56 crc kubenswrapper[4711]: I1203 12:48:56.655657 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f5d825d-4bc4-4387-bcc7-3166a7524a8b" containerName="mariadb-account-delete" Dec 03 12:48:56 crc kubenswrapper[4711]: I1203 12:48:56.656411 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-ks97b" Dec 03 12:48:56 crc kubenswrapper[4711]: I1203 12:48:56.667571 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-be2b-account-create-update-fw7h5"] Dec 03 12:48:56 crc kubenswrapper[4711]: I1203 12:48:56.668398 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-be2b-account-create-update-fw7h5" Dec 03 12:48:56 crc kubenswrapper[4711]: I1203 12:48:56.669549 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Dec 03 12:48:56 crc kubenswrapper[4711]: I1203 12:48:56.678339 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-ks97b"] Dec 03 12:48:56 crc kubenswrapper[4711]: I1203 12:48:56.684628 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-be2b-account-create-update-fw7h5"] Dec 03 12:48:56 crc kubenswrapper[4711]: I1203 12:48:56.689130 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1-operator-scripts\") pod \"glance-db-create-ks97b\" (UID: \"a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1\") " pod="glance-kuttl-tests/glance-db-create-ks97b" Dec 03 12:48:56 crc kubenswrapper[4711]: I1203 12:48:56.689238 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm2gd\" (UniqueName: \"kubernetes.io/projected/a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1-kube-api-access-pm2gd\") pod \"glance-db-create-ks97b\" (UID: \"a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1\") " pod="glance-kuttl-tests/glance-db-create-ks97b" Dec 03 12:48:56 crc kubenswrapper[4711]: I1203 12:48:56.790749 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjbh6\" (UniqueName: \"kubernetes.io/projected/92a177a3-ca3b-4aad-8365-791bbc65e089-kube-api-access-xjbh6\") pod \"glance-be2b-account-create-update-fw7h5\" (UID: \"92a177a3-ca3b-4aad-8365-791bbc65e089\") " pod="glance-kuttl-tests/glance-be2b-account-create-update-fw7h5" Dec 03 12:48:56 crc kubenswrapper[4711]: I1203 12:48:56.790899 4711 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1-operator-scripts\") pod \"glance-db-create-ks97b\" (UID: \"a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1\") " pod="glance-kuttl-tests/glance-db-create-ks97b" Dec 03 12:48:56 crc kubenswrapper[4711]: I1203 12:48:56.790966 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92a177a3-ca3b-4aad-8365-791bbc65e089-operator-scripts\") pod \"glance-be2b-account-create-update-fw7h5\" (UID: \"92a177a3-ca3b-4aad-8365-791bbc65e089\") " pod="glance-kuttl-tests/glance-be2b-account-create-update-fw7h5" Dec 03 12:48:56 crc kubenswrapper[4711]: I1203 12:48:56.791013 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm2gd\" (UniqueName: \"kubernetes.io/projected/a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1-kube-api-access-pm2gd\") pod \"glance-db-create-ks97b\" (UID: \"a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1\") " pod="glance-kuttl-tests/glance-db-create-ks97b" Dec 03 12:48:56 crc kubenswrapper[4711]: I1203 12:48:56.792277 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1-operator-scripts\") pod \"glance-db-create-ks97b\" (UID: \"a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1\") " pod="glance-kuttl-tests/glance-db-create-ks97b" Dec 03 12:48:56 crc kubenswrapper[4711]: I1203 12:48:56.810126 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm2gd\" (UniqueName: \"kubernetes.io/projected/a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1-kube-api-access-pm2gd\") pod \"glance-db-create-ks97b\" (UID: \"a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1\") " pod="glance-kuttl-tests/glance-db-create-ks97b" Dec 03 12:48:56 crc kubenswrapper[4711]: I1203 12:48:56.893283 4711 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92a177a3-ca3b-4aad-8365-791bbc65e089-operator-scripts\") pod \"glance-be2b-account-create-update-fw7h5\" (UID: \"92a177a3-ca3b-4aad-8365-791bbc65e089\") " pod="glance-kuttl-tests/glance-be2b-account-create-update-fw7h5" Dec 03 12:48:56 crc kubenswrapper[4711]: I1203 12:48:56.893412 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjbh6\" (UniqueName: \"kubernetes.io/projected/92a177a3-ca3b-4aad-8365-791bbc65e089-kube-api-access-xjbh6\") pod \"glance-be2b-account-create-update-fw7h5\" (UID: \"92a177a3-ca3b-4aad-8365-791bbc65e089\") " pod="glance-kuttl-tests/glance-be2b-account-create-update-fw7h5" Dec 03 12:48:56 crc kubenswrapper[4711]: I1203 12:48:56.894340 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92a177a3-ca3b-4aad-8365-791bbc65e089-operator-scripts\") pod \"glance-be2b-account-create-update-fw7h5\" (UID: \"92a177a3-ca3b-4aad-8365-791bbc65e089\") " pod="glance-kuttl-tests/glance-be2b-account-create-update-fw7h5" Dec 03 12:48:56 crc kubenswrapper[4711]: I1203 12:48:56.909426 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjbh6\" (UniqueName: \"kubernetes.io/projected/92a177a3-ca3b-4aad-8365-791bbc65e089-kube-api-access-xjbh6\") pod \"glance-be2b-account-create-update-fw7h5\" (UID: \"92a177a3-ca3b-4aad-8365-791bbc65e089\") " pod="glance-kuttl-tests/glance-be2b-account-create-update-fw7h5" Dec 03 12:48:56 crc kubenswrapper[4711]: I1203 12:48:56.999148 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-ks97b" Dec 03 12:48:57 crc kubenswrapper[4711]: I1203 12:48:57.011885 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-be2b-account-create-update-fw7h5" Dec 03 12:48:57 crc kubenswrapper[4711]: I1203 12:48:57.330868 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-be2b-account-create-update-fw7h5"] Dec 03 12:48:57 crc kubenswrapper[4711]: I1203 12:48:57.593059 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-ks97b"] Dec 03 12:48:57 crc kubenswrapper[4711]: W1203 12:48:57.602151 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda30f24fa_04a4_4d3c_aebb_ec95e57b1cc1.slice/crio-9a7e87c708fcb15d1a9b9b2bba7d0224dcc88f5da777ec3281199e5cdc803d3b WatchSource:0}: Error finding container 9a7e87c708fcb15d1a9b9b2bba7d0224dcc88f5da777ec3281199e5cdc803d3b: Status 404 returned error can't find the container with id 9a7e87c708fcb15d1a9b9b2bba7d0224dcc88f5da777ec3281199e5cdc803d3b Dec 03 12:48:57 crc kubenswrapper[4711]: I1203 12:48:57.929581 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-ks97b" event={"ID":"a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1","Type":"ContainerStarted","Data":"4cf4fdd4e81c79e46e70376eee022c59b7215343bd29393a65871cd49825ffe3"} Dec 03 12:48:57 crc kubenswrapper[4711]: I1203 12:48:57.929630 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-ks97b" event={"ID":"a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1","Type":"ContainerStarted","Data":"9a7e87c708fcb15d1a9b9b2bba7d0224dcc88f5da777ec3281199e5cdc803d3b"} Dec 03 12:48:57 crc kubenswrapper[4711]: I1203 12:48:57.931065 4711 generic.go:334] "Generic (PLEG): container finished" podID="92a177a3-ca3b-4aad-8365-791bbc65e089" containerID="606b5eecb5a23572e62b5a0263b6dcead6590b063fe003f1ea4ff62851f38799" exitCode=0 Dec 03 12:48:57 crc kubenswrapper[4711]: I1203 12:48:57.931098 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-be2b-account-create-update-fw7h5" event={"ID":"92a177a3-ca3b-4aad-8365-791bbc65e089","Type":"ContainerDied","Data":"606b5eecb5a23572e62b5a0263b6dcead6590b063fe003f1ea4ff62851f38799"} Dec 03 12:48:57 crc kubenswrapper[4711]: I1203 12:48:57.931118 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-be2b-account-create-update-fw7h5" event={"ID":"92a177a3-ca3b-4aad-8365-791bbc65e089","Type":"ContainerStarted","Data":"125d8377ef8d944cc9a3a8fad6719e1f8b7df61f4840d299b4dbe512f441a51b"} Dec 03 12:48:57 crc kubenswrapper[4711]: I1203 12:48:57.953173 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-create-ks97b" podStartSLOduration=1.953155258 podStartE2EDuration="1.953155258s" podCreationTimestamp="2025-12-03 12:48:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:48:57.948783348 +0000 UTC m=+2056.618034623" watchObservedRunningTime="2025-12-03 12:48:57.953155258 +0000 UTC m=+2056.622406513" Dec 03 12:48:58 crc kubenswrapper[4711]: I1203 12:48:58.937253 4711 generic.go:334] "Generic (PLEG): container finished" podID="a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1" containerID="4cf4fdd4e81c79e46e70376eee022c59b7215343bd29393a65871cd49825ffe3" exitCode=0 Dec 03 12:48:58 crc kubenswrapper[4711]: I1203 12:48:58.938981 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-ks97b" event={"ID":"a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1","Type":"ContainerDied","Data":"4cf4fdd4e81c79e46e70376eee022c59b7215343bd29393a65871cd49825ffe3"} Dec 03 12:48:59 crc kubenswrapper[4711]: I1203 12:48:59.230017 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-be2b-account-create-update-fw7h5" Dec 03 12:48:59 crc kubenswrapper[4711]: I1203 12:48:59.328281 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92a177a3-ca3b-4aad-8365-791bbc65e089-operator-scripts\") pod \"92a177a3-ca3b-4aad-8365-791bbc65e089\" (UID: \"92a177a3-ca3b-4aad-8365-791bbc65e089\") " Dec 03 12:48:59 crc kubenswrapper[4711]: I1203 12:48:59.328495 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjbh6\" (UniqueName: \"kubernetes.io/projected/92a177a3-ca3b-4aad-8365-791bbc65e089-kube-api-access-xjbh6\") pod \"92a177a3-ca3b-4aad-8365-791bbc65e089\" (UID: \"92a177a3-ca3b-4aad-8365-791bbc65e089\") " Dec 03 12:48:59 crc kubenswrapper[4711]: I1203 12:48:59.328823 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92a177a3-ca3b-4aad-8365-791bbc65e089-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "92a177a3-ca3b-4aad-8365-791bbc65e089" (UID: "92a177a3-ca3b-4aad-8365-791bbc65e089"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:48:59 crc kubenswrapper[4711]: I1203 12:48:59.334997 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92a177a3-ca3b-4aad-8365-791bbc65e089-kube-api-access-xjbh6" (OuterVolumeSpecName: "kube-api-access-xjbh6") pod "92a177a3-ca3b-4aad-8365-791bbc65e089" (UID: "92a177a3-ca3b-4aad-8365-791bbc65e089"). InnerVolumeSpecName "kube-api-access-xjbh6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:48:59 crc kubenswrapper[4711]: I1203 12:48:59.429855 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjbh6\" (UniqueName: \"kubernetes.io/projected/92a177a3-ca3b-4aad-8365-791bbc65e089-kube-api-access-xjbh6\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:59 crc kubenswrapper[4711]: I1203 12:48:59.429891 4711 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92a177a3-ca3b-4aad-8365-791bbc65e089-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:59 crc kubenswrapper[4711]: I1203 12:48:59.946132 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-be2b-account-create-update-fw7h5" Dec 03 12:48:59 crc kubenswrapper[4711]: I1203 12:48:59.946964 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-be2b-account-create-update-fw7h5" event={"ID":"92a177a3-ca3b-4aad-8365-791bbc65e089","Type":"ContainerDied","Data":"125d8377ef8d944cc9a3a8fad6719e1f8b7df61f4840d299b4dbe512f441a51b"} Dec 03 12:48:59 crc kubenswrapper[4711]: I1203 12:48:59.946988 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="125d8377ef8d944cc9a3a8fad6719e1f8b7df61f4840d299b4dbe512f441a51b" Dec 03 12:49:00 crc kubenswrapper[4711]: I1203 12:49:00.258082 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-ks97b" Dec 03 12:49:00 crc kubenswrapper[4711]: I1203 12:49:00.346764 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1-operator-scripts\") pod \"a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1\" (UID: \"a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1\") " Dec 03 12:49:00 crc kubenswrapper[4711]: I1203 12:49:00.347047 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm2gd\" (UniqueName: \"kubernetes.io/projected/a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1-kube-api-access-pm2gd\") pod \"a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1\" (UID: \"a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1\") " Dec 03 12:49:00 crc kubenswrapper[4711]: I1203 12:49:00.347602 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1" (UID: "a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:49:00 crc kubenswrapper[4711]: I1203 12:49:00.351678 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1-kube-api-access-pm2gd" (OuterVolumeSpecName: "kube-api-access-pm2gd") pod "a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1" (UID: "a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1"). InnerVolumeSpecName "kube-api-access-pm2gd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:49:00 crc kubenswrapper[4711]: I1203 12:49:00.448410 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm2gd\" (UniqueName: \"kubernetes.io/projected/a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1-kube-api-access-pm2gd\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:00 crc kubenswrapper[4711]: I1203 12:49:00.448474 4711 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:00 crc kubenswrapper[4711]: I1203 12:49:00.977890 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-ks97b" event={"ID":"a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1","Type":"ContainerDied","Data":"9a7e87c708fcb15d1a9b9b2bba7d0224dcc88f5da777ec3281199e5cdc803d3b"} Dec 03 12:49:00 crc kubenswrapper[4711]: I1203 12:49:00.978319 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a7e87c708fcb15d1a9b9b2bba7d0224dcc88f5da777ec3281199e5cdc803d3b" Dec 03 12:49:00 crc kubenswrapper[4711]: I1203 12:49:00.977973 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-ks97b" Dec 03 12:49:01 crc kubenswrapper[4711]: I1203 12:49:01.769318 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-ht7xq"] Dec 03 12:49:01 crc kubenswrapper[4711]: E1203 12:49:01.769660 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a177a3-ca3b-4aad-8365-791bbc65e089" containerName="mariadb-account-create-update" Dec 03 12:49:01 crc kubenswrapper[4711]: I1203 12:49:01.769681 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a177a3-ca3b-4aad-8365-791bbc65e089" containerName="mariadb-account-create-update" Dec 03 12:49:01 crc kubenswrapper[4711]: E1203 12:49:01.769721 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1" containerName="mariadb-database-create" Dec 03 12:49:01 crc kubenswrapper[4711]: I1203 12:49:01.769731 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1" containerName="mariadb-database-create" Dec 03 12:49:01 crc kubenswrapper[4711]: I1203 12:49:01.769933 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1" containerName="mariadb-database-create" Dec 03 12:49:01 crc kubenswrapper[4711]: I1203 12:49:01.769961 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="92a177a3-ca3b-4aad-8365-791bbc65e089" containerName="mariadb-account-create-update" Dec 03 12:49:01 crc kubenswrapper[4711]: I1203 12:49:01.770590 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-ht7xq" Dec 03 12:49:01 crc kubenswrapper[4711]: I1203 12:49:01.772990 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-8slzn" Dec 03 12:49:01 crc kubenswrapper[4711]: I1203 12:49:01.774218 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Dec 03 12:49:01 crc kubenswrapper[4711]: I1203 12:49:01.776263 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-ht7xq"] Dec 03 12:49:01 crc kubenswrapper[4711]: I1203 12:49:01.875776 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcrvq\" (UniqueName: \"kubernetes.io/projected/864de37c-1231-4155-988a-14b4bcb9c3aa-kube-api-access-hcrvq\") pod \"glance-db-sync-ht7xq\" (UID: \"864de37c-1231-4155-988a-14b4bcb9c3aa\") " pod="glance-kuttl-tests/glance-db-sync-ht7xq" Dec 03 12:49:01 crc kubenswrapper[4711]: I1203 12:49:01.875857 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/864de37c-1231-4155-988a-14b4bcb9c3aa-db-sync-config-data\") pod \"glance-db-sync-ht7xq\" (UID: \"864de37c-1231-4155-988a-14b4bcb9c3aa\") " pod="glance-kuttl-tests/glance-db-sync-ht7xq" Dec 03 12:49:01 crc kubenswrapper[4711]: I1203 12:49:01.875929 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/864de37c-1231-4155-988a-14b4bcb9c3aa-config-data\") pod \"glance-db-sync-ht7xq\" (UID: \"864de37c-1231-4155-988a-14b4bcb9c3aa\") " pod="glance-kuttl-tests/glance-db-sync-ht7xq" Dec 03 12:49:01 crc kubenswrapper[4711]: I1203 12:49:01.977760 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcrvq\" (UniqueName: 
\"kubernetes.io/projected/864de37c-1231-4155-988a-14b4bcb9c3aa-kube-api-access-hcrvq\") pod \"glance-db-sync-ht7xq\" (UID: \"864de37c-1231-4155-988a-14b4bcb9c3aa\") " pod="glance-kuttl-tests/glance-db-sync-ht7xq" Dec 03 12:49:01 crc kubenswrapper[4711]: I1203 12:49:01.977989 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/864de37c-1231-4155-988a-14b4bcb9c3aa-db-sync-config-data\") pod \"glance-db-sync-ht7xq\" (UID: \"864de37c-1231-4155-988a-14b4bcb9c3aa\") " pod="glance-kuttl-tests/glance-db-sync-ht7xq" Dec 03 12:49:01 crc kubenswrapper[4711]: I1203 12:49:01.978100 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/864de37c-1231-4155-988a-14b4bcb9c3aa-config-data\") pod \"glance-db-sync-ht7xq\" (UID: \"864de37c-1231-4155-988a-14b4bcb9c3aa\") " pod="glance-kuttl-tests/glance-db-sync-ht7xq" Dec 03 12:49:01 crc kubenswrapper[4711]: I1203 12:49:01.985732 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/864de37c-1231-4155-988a-14b4bcb9c3aa-config-data\") pod \"glance-db-sync-ht7xq\" (UID: \"864de37c-1231-4155-988a-14b4bcb9c3aa\") " pod="glance-kuttl-tests/glance-db-sync-ht7xq" Dec 03 12:49:01 crc kubenswrapper[4711]: I1203 12:49:01.986492 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/864de37c-1231-4155-988a-14b4bcb9c3aa-db-sync-config-data\") pod \"glance-db-sync-ht7xq\" (UID: \"864de37c-1231-4155-988a-14b4bcb9c3aa\") " pod="glance-kuttl-tests/glance-db-sync-ht7xq" Dec 03 12:49:01 crc kubenswrapper[4711]: I1203 12:49:01.998998 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcrvq\" (UniqueName: \"kubernetes.io/projected/864de37c-1231-4155-988a-14b4bcb9c3aa-kube-api-access-hcrvq\") pod 
\"glance-db-sync-ht7xq\" (UID: \"864de37c-1231-4155-988a-14b4bcb9c3aa\") " pod="glance-kuttl-tests/glance-db-sync-ht7xq" Dec 03 12:49:02 crc kubenswrapper[4711]: I1203 12:49:02.096234 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-ht7xq" Dec 03 12:49:02 crc kubenswrapper[4711]: I1203 12:49:02.518000 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-ht7xq"] Dec 03 12:49:02 crc kubenswrapper[4711]: W1203 12:49:02.519534 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod864de37c_1231_4155_988a_14b4bcb9c3aa.slice/crio-4634a4e5505f932e478a2a487ba5599e42eb9f574a313bc01b9bcbde99bb4894 WatchSource:0}: Error finding container 4634a4e5505f932e478a2a487ba5599e42eb9f574a313bc01b9bcbde99bb4894: Status 404 returned error can't find the container with id 4634a4e5505f932e478a2a487ba5599e42eb9f574a313bc01b9bcbde99bb4894 Dec 03 12:49:02 crc kubenswrapper[4711]: I1203 12:49:02.995788 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-ht7xq" event={"ID":"864de37c-1231-4155-988a-14b4bcb9c3aa","Type":"ContainerStarted","Data":"4634a4e5505f932e478a2a487ba5599e42eb9f574a313bc01b9bcbde99bb4894"} Dec 03 12:49:04 crc kubenswrapper[4711]: I1203 12:49:04.003613 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-ht7xq" event={"ID":"864de37c-1231-4155-988a-14b4bcb9c3aa","Type":"ContainerStarted","Data":"c5ea8bb6ee74cb78f9da8d1839ea6f5a9861ebc94eebd9fb1aa97cd9773f83a5"} Dec 03 12:49:04 crc kubenswrapper[4711]: I1203 12:49:04.021590 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-ht7xq" podStartSLOduration=3.021571845 podStartE2EDuration="3.021571845s" podCreationTimestamp="2025-12-03 12:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:49:04.018368997 +0000 UTC m=+2062.687620252" watchObservedRunningTime="2025-12-03 12:49:04.021571845 +0000 UTC m=+2062.690823100" Dec 03 12:49:05 crc kubenswrapper[4711]: I1203 12:49:05.401693 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:49:05 crc kubenswrapper[4711]: I1203 12:49:05.401757 4711 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:49:05 crc kubenswrapper[4711]: I1203 12:49:05.401799 4711 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" Dec 03 12:49:05 crc kubenswrapper[4711]: I1203 12:49:05.402433 4711 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bf5aaef3ea300423cb3fae0894d5ba5f74a380ce9c593a8a024e4a722f79eb6b"} pod="openshift-machine-config-operator/machine-config-daemon-52jgg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 12:49:05 crc kubenswrapper[4711]: I1203 12:49:05.402484 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" containerID="cri-o://bf5aaef3ea300423cb3fae0894d5ba5f74a380ce9c593a8a024e4a722f79eb6b" gracePeriod=600 
Dec 03 12:49:06 crc kubenswrapper[4711]: I1203 12:49:06.024100 4711 generic.go:334] "Generic (PLEG): container finished" podID="864de37c-1231-4155-988a-14b4bcb9c3aa" containerID="c5ea8bb6ee74cb78f9da8d1839ea6f5a9861ebc94eebd9fb1aa97cd9773f83a5" exitCode=0 Dec 03 12:49:06 crc kubenswrapper[4711]: I1203 12:49:06.024403 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-ht7xq" event={"ID":"864de37c-1231-4155-988a-14b4bcb9c3aa","Type":"ContainerDied","Data":"c5ea8bb6ee74cb78f9da8d1839ea6f5a9861ebc94eebd9fb1aa97cd9773f83a5"} Dec 03 12:49:06 crc kubenswrapper[4711]: I1203 12:49:06.028480 4711 generic.go:334] "Generic (PLEG): container finished" podID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerID="bf5aaef3ea300423cb3fae0894d5ba5f74a380ce9c593a8a024e4a722f79eb6b" exitCode=0 Dec 03 12:49:06 crc kubenswrapper[4711]: I1203 12:49:06.028521 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" event={"ID":"776e7d35-d59b-4d4a-97cd-aec4f2441c1e","Type":"ContainerDied","Data":"bf5aaef3ea300423cb3fae0894d5ba5f74a380ce9c593a8a024e4a722f79eb6b"} Dec 03 12:49:06 crc kubenswrapper[4711]: I1203 12:49:06.028547 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" event={"ID":"776e7d35-d59b-4d4a-97cd-aec4f2441c1e","Type":"ContainerStarted","Data":"34c4c8649e6b4de50f304749dbfcdcd32a9ef18c76884ef8de700370ecd0a6e3"} Dec 03 12:49:06 crc kubenswrapper[4711]: I1203 12:49:06.028564 4711 scope.go:117] "RemoveContainer" containerID="8aceb8eade31346b968b79b9eafdd0ba0740fbe4943a8f7cf724864aa6dd593f" Dec 03 12:49:07 crc kubenswrapper[4711]: I1203 12:49:07.320748 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-ht7xq" Dec 03 12:49:07 crc kubenswrapper[4711]: I1203 12:49:07.371664 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/864de37c-1231-4155-988a-14b4bcb9c3aa-db-sync-config-data\") pod \"864de37c-1231-4155-988a-14b4bcb9c3aa\" (UID: \"864de37c-1231-4155-988a-14b4bcb9c3aa\") " Dec 03 12:49:07 crc kubenswrapper[4711]: I1203 12:49:07.371818 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/864de37c-1231-4155-988a-14b4bcb9c3aa-config-data\") pod \"864de37c-1231-4155-988a-14b4bcb9c3aa\" (UID: \"864de37c-1231-4155-988a-14b4bcb9c3aa\") " Dec 03 12:49:07 crc kubenswrapper[4711]: I1203 12:49:07.371945 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcrvq\" (UniqueName: \"kubernetes.io/projected/864de37c-1231-4155-988a-14b4bcb9c3aa-kube-api-access-hcrvq\") pod \"864de37c-1231-4155-988a-14b4bcb9c3aa\" (UID: \"864de37c-1231-4155-988a-14b4bcb9c3aa\") " Dec 03 12:49:07 crc kubenswrapper[4711]: I1203 12:49:07.379412 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/864de37c-1231-4155-988a-14b4bcb9c3aa-kube-api-access-hcrvq" (OuterVolumeSpecName: "kube-api-access-hcrvq") pod "864de37c-1231-4155-988a-14b4bcb9c3aa" (UID: "864de37c-1231-4155-988a-14b4bcb9c3aa"). InnerVolumeSpecName "kube-api-access-hcrvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:49:07 crc kubenswrapper[4711]: I1203 12:49:07.380162 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/864de37c-1231-4155-988a-14b4bcb9c3aa-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "864de37c-1231-4155-988a-14b4bcb9c3aa" (UID: "864de37c-1231-4155-988a-14b4bcb9c3aa"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:49:07 crc kubenswrapper[4711]: I1203 12:49:07.428229 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/864de37c-1231-4155-988a-14b4bcb9c3aa-config-data" (OuterVolumeSpecName: "config-data") pod "864de37c-1231-4155-988a-14b4bcb9c3aa" (UID: "864de37c-1231-4155-988a-14b4bcb9c3aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:49:07 crc kubenswrapper[4711]: I1203 12:49:07.474575 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/864de37c-1231-4155-988a-14b4bcb9c3aa-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:07 crc kubenswrapper[4711]: I1203 12:49:07.474637 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcrvq\" (UniqueName: \"kubernetes.io/projected/864de37c-1231-4155-988a-14b4bcb9c3aa-kube-api-access-hcrvq\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:07 crc kubenswrapper[4711]: I1203 12:49:07.474659 4711 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/864de37c-1231-4155-988a-14b4bcb9c3aa-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:08 crc kubenswrapper[4711]: I1203 12:49:08.052062 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-ht7xq" event={"ID":"864de37c-1231-4155-988a-14b4bcb9c3aa","Type":"ContainerDied","Data":"4634a4e5505f932e478a2a487ba5599e42eb9f574a313bc01b9bcbde99bb4894"} Dec 03 12:49:08 crc kubenswrapper[4711]: I1203 12:49:08.052109 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4634a4e5505f932e478a2a487ba5599e42eb9f574a313bc01b9bcbde99bb4894" Dec 03 12:49:08 crc kubenswrapper[4711]: I1203 12:49:08.052216 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-ht7xq" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.283347 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 03 12:49:09 crc kubenswrapper[4711]: E1203 12:49:09.284003 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="864de37c-1231-4155-988a-14b4bcb9c3aa" containerName="glance-db-sync" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.284019 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="864de37c-1231-4155-988a-14b4bcb9c3aa" containerName="glance-db-sync" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.284177 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="864de37c-1231-4155-988a-14b4bcb9c3aa" containerName="glance-db-sync" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.284980 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.287012 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.287276 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-external-config-data" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.287281 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-8slzn" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.297377 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.402647 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/ae7e3c8c-8b3d-46de-8359-c33d175209e7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.402716 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-run\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.402736 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-dev\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.402759 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.402776 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddfhc\" (UniqueName: \"kubernetes.io/projected/ae7e3c8c-8b3d-46de-8359-c33d175209e7-kube-api-access-ddfhc\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.402792 4711 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.402816 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae7e3c8c-8b3d-46de-8359-c33d175209e7-logs\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.402835 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-sys\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.402853 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.402889 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae7e3c8c-8b3d-46de-8359-c33d175209e7-config-data\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.403023 4711 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.403060 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.403082 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae7e3c8c-8b3d-46de-8359-c33d175209e7-scripts\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.403109 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.504127 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae7e3c8c-8b3d-46de-8359-c33d175209e7-logs\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.504175 4711 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-sys\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.504201 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.504224 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae7e3c8c-8b3d-46de-8359-c33d175209e7-config-data\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.504251 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.504290 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.504318 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae7e3c8c-8b3d-46de-8359-c33d175209e7-scripts\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.504352 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.504399 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae7e3c8c-8b3d-46de-8359-c33d175209e7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.504428 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-run\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.504446 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-dev\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.504465 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.504481 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddfhc\" (UniqueName: \"kubernetes.io/projected/ae7e3c8c-8b3d-46de-8359-c33d175209e7-kube-api-access-ddfhc\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.504497 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.504557 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.504952 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae7e3c8c-8b3d-46de-8359-c33d175209e7-logs\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.504988 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-sys\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.505423 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") device mount path \"/mnt/openstack/pv07\"" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.507112 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") device mount path \"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.507156 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.507218 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.507235 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-dev\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.507338 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-run\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.507404 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae7e3c8c-8b3d-46de-8359-c33d175209e7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.507454 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.513647 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae7e3c8c-8b3d-46de-8359-c33d175209e7-scripts\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.513698 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae7e3c8c-8b3d-46de-8359-c33d175209e7-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.526014 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.526775 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.529107 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddfhc\" (UniqueName: \"kubernetes.io/projected/ae7e3c8c-8b3d-46de-8359-c33d175209e7-kube-api-access-ddfhc\") pod \"glance-default-external-api-0\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.578356 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.579639 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.582095 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.591091 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.600973 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.629120 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.733623 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.734456 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02a539fb-5e76-4adf-8266-dc2fdd4f3165-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.734496 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.734526 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.734554 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-sys\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.734589 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-dev\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.734640 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02a539fb-5e76-4adf-8266-dc2fdd4f3165-logs\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.734701 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.734761 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a539fb-5e76-4adf-8266-dc2fdd4f3165-scripts\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.734789 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-run\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.734808 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.734864 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a539fb-5e76-4adf-8266-dc2fdd4f3165-config-data\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.734896 4711 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.734938 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54zpw\" (UniqueName: \"kubernetes.io/projected/02a539fb-5e76-4adf-8266-dc2fdd4f3165-kube-api-access-54zpw\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.735568 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") device mount path \"/mnt/openstack/pv09\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.770250 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.836460 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-dev\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.836529 4711 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02a539fb-5e76-4adf-8266-dc2fdd4f3165-logs\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.836582 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a539fb-5e76-4adf-8266-dc2fdd4f3165-scripts\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.836602 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-run\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.836618 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.836639 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a539fb-5e76-4adf-8266-dc2fdd4f3165-config-data\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.836640 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"dev\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-dev\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.836664 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.836683 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54zpw\" (UniqueName: \"kubernetes.io/projected/02a539fb-5e76-4adf-8266-dc2fdd4f3165-kube-api-access-54zpw\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.836719 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.836737 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02a539fb-5e76-4adf-8266-dc2fdd4f3165-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.836756 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.836776 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.836795 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-sys\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.836874 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-sys\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.836927 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-run\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.836977 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-var-locks-brick\") pod 
\"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.837951 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.838237 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.838558 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02a539fb-5e76-4adf-8266-dc2fdd4f3165-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.838614 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.838738 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") device mount 
path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.842050 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02a539fb-5e76-4adf-8266-dc2fdd4f3165-logs\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.842863 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a539fb-5e76-4adf-8266-dc2fdd4f3165-scripts\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.846614 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a539fb-5e76-4adf-8266-dc2fdd4f3165-config-data\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.849320 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.856687 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54zpw\" (UniqueName: \"kubernetes.io/projected/02a539fb-5e76-4adf-8266-dc2fdd4f3165-kube-api-access-54zpw\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.870058 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:09 crc kubenswrapper[4711]: I1203 12:49:09.894944 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:10 crc kubenswrapper[4711]: I1203 12:49:10.067181 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"ae7e3c8c-8b3d-46de-8359-c33d175209e7","Type":"ContainerStarted","Data":"68c329d5602a09881a0239bbd20b701898d1d6687559040c8711914159eccb84"} Dec 03 12:49:10 crc kubenswrapper[4711]: I1203 12:49:10.228023 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:49:10 crc kubenswrapper[4711]: I1203 12:49:10.387360 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.077327 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"ae7e3c8c-8b3d-46de-8359-c33d175209e7","Type":"ContainerStarted","Data":"6c4faba728e6f7de002f6e67aadf9a6ce3ca606a520131d2f30be162e184b7aa"} Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.077865 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"ae7e3c8c-8b3d-46de-8359-c33d175209e7","Type":"ContainerStarted","Data":"9fe93a020d7261971c43f64fc162e597174a7c5f840ebbce678b618beff6ea09"} Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.079264 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"02a539fb-5e76-4adf-8266-dc2fdd4f3165","Type":"ContainerStarted","Data":"f2e5d69d1b0165b02148b8ed866cc13ba4b65f4129fd5ed74c3af8f326631312"} Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.079322 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"02a539fb-5e76-4adf-8266-dc2fdd4f3165","Type":"ContainerStarted","Data":"073f303eb8b54644ae18da217799fa62cdf3d77061498b2ab774b1d7b3b97b73"} Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.079342 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"02a539fb-5e76-4adf-8266-dc2fdd4f3165","Type":"ContainerStarted","Data":"795de34211a986ac75d8137f8a781c2eb807b36e5c982cc3d99aff20951b5bd5"} Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.079378 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="02a539fb-5e76-4adf-8266-dc2fdd4f3165" containerName="glance-log" containerID="cri-o://073f303eb8b54644ae18da217799fa62cdf3d77061498b2ab774b1d7b3b97b73" gracePeriod=30 Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.079411 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="02a539fb-5e76-4adf-8266-dc2fdd4f3165" containerName="glance-httpd" containerID="cri-o://f2e5d69d1b0165b02148b8ed866cc13ba4b65f4129fd5ed74c3af8f326631312" gracePeriod=30 Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.129255 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.129216392 podStartE2EDuration="3.129216392s" podCreationTimestamp="2025-12-03 12:49:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 
12:49:11.128200244 +0000 UTC m=+2069.797451519" watchObservedRunningTime="2025-12-03 12:49:11.129216392 +0000 UTC m=+2069.798467657" Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.134258 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.1342482289999998 podStartE2EDuration="2.134248229s" podCreationTimestamp="2025-12-03 12:49:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:49:11.105704736 +0000 UTC m=+2069.774956011" watchObservedRunningTime="2025-12-03 12:49:11.134248229 +0000 UTC m=+2069.803499504" Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.584729 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.667475 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a539fb-5e76-4adf-8266-dc2fdd4f3165-scripts\") pod \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.667867 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-run\") pod \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.667893 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54zpw\" (UniqueName: \"kubernetes.io/projected/02a539fb-5e76-4adf-8266-dc2fdd4f3165-kube-api-access-54zpw\") pod \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " Dec 03 12:49:11 crc kubenswrapper[4711]: 
I1203 12:49:11.667944 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02a539fb-5e76-4adf-8266-dc2fdd4f3165-logs\") pod \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.667965 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.668041 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-run" (OuterVolumeSpecName: "run") pod "02a539fb-5e76-4adf-8266-dc2fdd4f3165" (UID: "02a539fb-5e76-4adf-8266-dc2fdd4f3165"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.668176 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-var-locks-brick\") pod \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.668220 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-etc-nvme\") pod \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.668252 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\" (UID: 
\"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.668253 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "02a539fb-5e76-4adf-8266-dc2fdd4f3165" (UID: "02a539fb-5e76-4adf-8266-dc2fdd4f3165"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.668271 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02a539fb-5e76-4adf-8266-dc2fdd4f3165-httpd-run\") pod \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.668350 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "02a539fb-5e76-4adf-8266-dc2fdd4f3165" (UID: "02a539fb-5e76-4adf-8266-dc2fdd4f3165"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.668445 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-etc-iscsi\") pod \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.668493 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-sys\") pod \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.668521 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a539fb-5e76-4adf-8266-dc2fdd4f3165-config-data\") pod \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.668582 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-dev\") pod \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.668648 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-lib-modules\") pod \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\" (UID: \"02a539fb-5e76-4adf-8266-dc2fdd4f3165\") " Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.668738 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-sys" (OuterVolumeSpecName: "sys") pod 
"02a539fb-5e76-4adf-8266-dc2fdd4f3165" (UID: "02a539fb-5e76-4adf-8266-dc2fdd4f3165"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.668769 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "02a539fb-5e76-4adf-8266-dc2fdd4f3165" (UID: "02a539fb-5e76-4adf-8266-dc2fdd4f3165"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.668922 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-dev" (OuterVolumeSpecName: "dev") pod "02a539fb-5e76-4adf-8266-dc2fdd4f3165" (UID: "02a539fb-5e76-4adf-8266-dc2fdd4f3165"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.668963 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "02a539fb-5e76-4adf-8266-dc2fdd4f3165" (UID: "02a539fb-5e76-4adf-8266-dc2fdd4f3165"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.669188 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02a539fb-5e76-4adf-8266-dc2fdd4f3165-logs" (OuterVolumeSpecName: "logs") pod "02a539fb-5e76-4adf-8266-dc2fdd4f3165" (UID: "02a539fb-5e76-4adf-8266-dc2fdd4f3165"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.669354 4711 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.669373 4711 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.669383 4711 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02a539fb-5e76-4adf-8266-dc2fdd4f3165-logs\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.669433 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02a539fb-5e76-4adf-8266-dc2fdd4f3165-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "02a539fb-5e76-4adf-8266-dc2fdd4f3165" (UID: "02a539fb-5e76-4adf-8266-dc2fdd4f3165"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.669393 4711 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.669467 4711 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.669477 4711 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.669487 4711 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-sys\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.669497 4711 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/02a539fb-5e76-4adf-8266-dc2fdd4f3165-dev\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.673687 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "02a539fb-5e76-4adf-8266-dc2fdd4f3165" (UID: "02a539fb-5e76-4adf-8266-dc2fdd4f3165"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.674013 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02a539fb-5e76-4adf-8266-dc2fdd4f3165-kube-api-access-54zpw" (OuterVolumeSpecName: "kube-api-access-54zpw") pod "02a539fb-5e76-4adf-8266-dc2fdd4f3165" (UID: "02a539fb-5e76-4adf-8266-dc2fdd4f3165"). InnerVolumeSpecName "kube-api-access-54zpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.674042 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance-cache") pod "02a539fb-5e76-4adf-8266-dc2fdd4f3165" (UID: "02a539fb-5e76-4adf-8266-dc2fdd4f3165"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.679230 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02a539fb-5e76-4adf-8266-dc2fdd4f3165-scripts" (OuterVolumeSpecName: "scripts") pod "02a539fb-5e76-4adf-8266-dc2fdd4f3165" (UID: "02a539fb-5e76-4adf-8266-dc2fdd4f3165"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.710287 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02a539fb-5e76-4adf-8266-dc2fdd4f3165-config-data" (OuterVolumeSpecName: "config-data") pod "02a539fb-5e76-4adf-8266-dc2fdd4f3165" (UID: "02a539fb-5e76-4adf-8266-dc2fdd4f3165"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.771007 4711 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a539fb-5e76-4adf-8266-dc2fdd4f3165-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.771090 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54zpw\" (UniqueName: \"kubernetes.io/projected/02a539fb-5e76-4adf-8266-dc2fdd4f3165-kube-api-access-54zpw\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.771170 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.771187 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.771211 4711 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02a539fb-5e76-4adf-8266-dc2fdd4f3165-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.771222 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a539fb-5e76-4adf-8266-dc2fdd4f3165-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.783653 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.788208 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" 
(UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.872551 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:11 crc kubenswrapper[4711]: I1203 12:49:11.872580 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.089313 4711 generic.go:334] "Generic (PLEG): container finished" podID="02a539fb-5e76-4adf-8266-dc2fdd4f3165" containerID="f2e5d69d1b0165b02148b8ed866cc13ba4b65f4129fd5ed74c3af8f326631312" exitCode=143 Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.089357 4711 generic.go:334] "Generic (PLEG): container finished" podID="02a539fb-5e76-4adf-8266-dc2fdd4f3165" containerID="073f303eb8b54644ae18da217799fa62cdf3d77061498b2ab774b1d7b3b97b73" exitCode=143 Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.089425 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.089459 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"02a539fb-5e76-4adf-8266-dc2fdd4f3165","Type":"ContainerDied","Data":"f2e5d69d1b0165b02148b8ed866cc13ba4b65f4129fd5ed74c3af8f326631312"} Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.089519 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"02a539fb-5e76-4adf-8266-dc2fdd4f3165","Type":"ContainerDied","Data":"073f303eb8b54644ae18da217799fa62cdf3d77061498b2ab774b1d7b3b97b73"} Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.089540 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"02a539fb-5e76-4adf-8266-dc2fdd4f3165","Type":"ContainerDied","Data":"795de34211a986ac75d8137f8a781c2eb807b36e5c982cc3d99aff20951b5bd5"} Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.089564 4711 scope.go:117] "RemoveContainer" containerID="f2e5d69d1b0165b02148b8ed866cc13ba4b65f4129fd5ed74c3af8f326631312" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.127844 4711 scope.go:117] "RemoveContainer" containerID="073f303eb8b54644ae18da217799fa62cdf3d77061498b2ab774b1d7b3b97b73" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.169240 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.178131 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.186385 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:49:12 crc kubenswrapper[4711]: E1203 12:49:12.186937 4711 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a539fb-5e76-4adf-8266-dc2fdd4f3165" containerName="glance-httpd" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.186966 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a539fb-5e76-4adf-8266-dc2fdd4f3165" containerName="glance-httpd" Dec 03 12:49:12 crc kubenswrapper[4711]: E1203 12:49:12.186989 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a539fb-5e76-4adf-8266-dc2fdd4f3165" containerName="glance-log" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.187002 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a539fb-5e76-4adf-8266-dc2fdd4f3165" containerName="glance-log" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.187228 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="02a539fb-5e76-4adf-8266-dc2fdd4f3165" containerName="glance-httpd" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.187273 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="02a539fb-5e76-4adf-8266-dc2fdd4f3165" containerName="glance-log" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.188487 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.191420 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.195264 4711 scope.go:117] "RemoveContainer" containerID="f2e5d69d1b0165b02148b8ed866cc13ba4b65f4129fd5ed74c3af8f326631312" Dec 03 12:49:12 crc kubenswrapper[4711]: E1203 12:49:12.197296 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2e5d69d1b0165b02148b8ed866cc13ba4b65f4129fd5ed74c3af8f326631312\": container with ID starting with f2e5d69d1b0165b02148b8ed866cc13ba4b65f4129fd5ed74c3af8f326631312 not found: ID does not exist" containerID="f2e5d69d1b0165b02148b8ed866cc13ba4b65f4129fd5ed74c3af8f326631312" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.197344 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2e5d69d1b0165b02148b8ed866cc13ba4b65f4129fd5ed74c3af8f326631312"} err="failed to get container status \"f2e5d69d1b0165b02148b8ed866cc13ba4b65f4129fd5ed74c3af8f326631312\": rpc error: code = NotFound desc = could not find container \"f2e5d69d1b0165b02148b8ed866cc13ba4b65f4129fd5ed74c3af8f326631312\": container with ID starting with f2e5d69d1b0165b02148b8ed866cc13ba4b65f4129fd5ed74c3af8f326631312 not found: ID does not exist" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.197374 4711 scope.go:117] "RemoveContainer" containerID="073f303eb8b54644ae18da217799fa62cdf3d77061498b2ab774b1d7b3b97b73" Dec 03 12:49:12 crc kubenswrapper[4711]: E1203 12:49:12.198051 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"073f303eb8b54644ae18da217799fa62cdf3d77061498b2ab774b1d7b3b97b73\": container with ID starting with 
073f303eb8b54644ae18da217799fa62cdf3d77061498b2ab774b1d7b3b97b73 not found: ID does not exist" containerID="073f303eb8b54644ae18da217799fa62cdf3d77061498b2ab774b1d7b3b97b73" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.198183 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"073f303eb8b54644ae18da217799fa62cdf3d77061498b2ab774b1d7b3b97b73"} err="failed to get container status \"073f303eb8b54644ae18da217799fa62cdf3d77061498b2ab774b1d7b3b97b73\": rpc error: code = NotFound desc = could not find container \"073f303eb8b54644ae18da217799fa62cdf3d77061498b2ab774b1d7b3b97b73\": container with ID starting with 073f303eb8b54644ae18da217799fa62cdf3d77061498b2ab774b1d7b3b97b73 not found: ID does not exist" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.198339 4711 scope.go:117] "RemoveContainer" containerID="f2e5d69d1b0165b02148b8ed866cc13ba4b65f4129fd5ed74c3af8f326631312" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.198819 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2e5d69d1b0165b02148b8ed866cc13ba4b65f4129fd5ed74c3af8f326631312"} err="failed to get container status \"f2e5d69d1b0165b02148b8ed866cc13ba4b65f4129fd5ed74c3af8f326631312\": rpc error: code = NotFound desc = could not find container \"f2e5d69d1b0165b02148b8ed866cc13ba4b65f4129fd5ed74c3af8f326631312\": container with ID starting with f2e5d69d1b0165b02148b8ed866cc13ba4b65f4129fd5ed74c3af8f326631312 not found: ID does not exist" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.198851 4711 scope.go:117] "RemoveContainer" containerID="073f303eb8b54644ae18da217799fa62cdf3d77061498b2ab774b1d7b3b97b73" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.202628 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.204111 4711 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"073f303eb8b54644ae18da217799fa62cdf3d77061498b2ab774b1d7b3b97b73"} err="failed to get container status \"073f303eb8b54644ae18da217799fa62cdf3d77061498b2ab774b1d7b3b97b73\": rpc error: code = NotFound desc = could not find container \"073f303eb8b54644ae18da217799fa62cdf3d77061498b2ab774b1d7b3b97b73\": container with ID starting with 073f303eb8b54644ae18da217799fa62cdf3d77061498b2ab774b1d7b3b97b73 not found: ID does not exist" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.288771 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-dev\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.288820 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.288854 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.288975 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" 
(UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.288998 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.289022 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-sys\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.289059 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.289094 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-logs\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.289136 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.289168 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-run\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.289191 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.289216 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.289671 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.289708 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnqkp\" (UniqueName: 
\"kubernetes.io/projected/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-kube-api-access-jnqkp\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.391308 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.391367 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-logs\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.391404 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.391427 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-run\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.391445 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.391462 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.391478 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.391497 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnqkp\" (UniqueName: \"kubernetes.io/projected/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-kube-api-access-jnqkp\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.391523 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-dev\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.391538 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.391554 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.391572 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-run\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.391595 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.391652 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.391693 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-sys\") pod 
\"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.391792 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.391825 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-sys\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.392212 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.392234 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") device mount path \"/mnt/openstack/pv09\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.392286 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-dev\") pod \"glance-default-internal-api-0\" (UID: 
\"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.392361 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.392438 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.392560 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.393030 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.394176 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-logs\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.401154 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.402936 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.414539 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnqkp\" (UniqueName: \"kubernetes.io/projected/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-kube-api-access-jnqkp\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.423324 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.463160 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: 
I1203 12:49:12.515607 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:12 crc kubenswrapper[4711]: I1203 12:49:12.828419 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:49:12 crc kubenswrapper[4711]: W1203 12:49:12.834778 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bfce555_bbde_4b58_9fcc_13b1e2db8a3c.slice/crio-ed33e328d018aac630d6c557d0ab55b44979cece8ace2aa2ad8f58acd1f3fff1 WatchSource:0}: Error finding container ed33e328d018aac630d6c557d0ab55b44979cece8ace2aa2ad8f58acd1f3fff1: Status 404 returned error can't find the container with id ed33e328d018aac630d6c557d0ab55b44979cece8ace2aa2ad8f58acd1f3fff1 Dec 03 12:49:13 crc kubenswrapper[4711]: I1203 12:49:13.103943 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c","Type":"ContainerStarted","Data":"307aeba66bd25e6f7c2f58ca07a49899392bd19e5a59f975825341a383b4b8d7"} Dec 03 12:49:13 crc kubenswrapper[4711]: I1203 12:49:13.103997 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c","Type":"ContainerStarted","Data":"ed33e328d018aac630d6c557d0ab55b44979cece8ace2aa2ad8f58acd1f3fff1"} Dec 03 12:49:13 crc kubenswrapper[4711]: I1203 12:49:13.829461 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02a539fb-5e76-4adf-8266-dc2fdd4f3165" path="/var/lib/kubelet/pods/02a539fb-5e76-4adf-8266-dc2fdd4f3165/volumes" Dec 03 12:49:14 crc kubenswrapper[4711]: I1203 12:49:14.125272 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c","Type":"ContainerStarted","Data":"bd9f23931c906d663b7cc8407e6555b6738ae66eaeff0ecb1cf7a82121351cba"} Dec 03 12:49:14 crc kubenswrapper[4711]: I1203 12:49:14.163140 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.163107116 podStartE2EDuration="2.163107116s" podCreationTimestamp="2025-12-03 12:49:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:49:14.148371712 +0000 UTC m=+2072.817622977" watchObservedRunningTime="2025-12-03 12:49:14.163107116 +0000 UTC m=+2072.832358411" Dec 03 12:49:19 crc kubenswrapper[4711]: I1203 12:49:19.602679 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:19 crc kubenswrapper[4711]: I1203 12:49:19.604226 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:19 crc kubenswrapper[4711]: I1203 12:49:19.642135 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:19 crc kubenswrapper[4711]: I1203 12:49:19.664607 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:20 crc kubenswrapper[4711]: I1203 12:49:20.176418 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:20 crc kubenswrapper[4711]: I1203 12:49:20.176496 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:22 crc kubenswrapper[4711]: I1203 12:49:22.394243 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:22 crc kubenswrapper[4711]: I1203 12:49:22.394649 4711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 12:49:22 crc kubenswrapper[4711]: I1203 12:49:22.400373 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:49:22 crc kubenswrapper[4711]: I1203 12:49:22.516732 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:22 crc kubenswrapper[4711]: I1203 12:49:22.516793 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:22 crc kubenswrapper[4711]: I1203 12:49:22.539344 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:22 crc kubenswrapper[4711]: I1203 12:49:22.565003 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:23 crc kubenswrapper[4711]: I1203 12:49:23.202174 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:23 crc kubenswrapper[4711]: I1203 12:49:23.202245 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:25 crc kubenswrapper[4711]: I1203 12:49:25.140575 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:25 crc kubenswrapper[4711]: I1203 12:49:25.221201 4711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 12:49:25 crc kubenswrapper[4711]: I1203 12:49:25.223875 4711 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.110731 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.111958 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.116793 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.118209 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.132954 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.159575 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.237300 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-run\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.237368 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e22a45f5-3560-498f-8faa-e31d07aa4d48-scripts\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc 
kubenswrapper[4711]: I1203 12:49:27.237408 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.237433 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.237473 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81590464-a00d-470b-86f6-8050913cf609-scripts\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.237500 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22a45f5-3560-498f-8faa-e31d07aa4d48-config-data\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.237522 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-dev\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" 
Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.237551 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.237571 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e22a45f5-3560-498f-8faa-e31d07aa4d48-logs\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.237593 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.237617 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-run\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.237644 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81590464-a00d-470b-86f6-8050913cf609-logs\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " 
pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.237663 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-dev\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.237686 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81590464-a00d-470b-86f6-8050913cf609-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.237720 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj9cl\" (UniqueName: \"kubernetes.io/projected/e22a45f5-3560-498f-8faa-e31d07aa4d48-kube-api-access-mj9cl\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.237749 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.237769 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncf9h\" (UniqueName: \"kubernetes.io/projected/81590464-a00d-470b-86f6-8050913cf609-kube-api-access-ncf9h\") pod 
\"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.237790 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.237816 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.237836 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.237855 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.237876 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.237943 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.237978 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e22a45f5-3560-498f-8faa-e31d07aa4d48-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.238007 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-sys\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.238031 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.238057 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/81590464-a00d-470b-86f6-8050913cf609-config-data\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.238076 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-sys\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.260044 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.261097 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.270770 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.278493 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.282083 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.292401 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.339826 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-sys\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.339879 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.339921 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.339943 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-logs\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " 
pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.339975 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-run\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.340002 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.340031 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e22a45f5-3560-498f-8faa-e31d07aa4d48-scripts\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.340062 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.340086 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " 
pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.340109 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.340134 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81590464-a00d-470b-86f6-8050913cf609-scripts\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.340155 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-scripts\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.340189 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22a45f5-3560-498f-8faa-e31d07aa4d48-config-data\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.340242 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-dev\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " 
pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.340271 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.340291 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e22a45f5-3560-498f-8faa-e31d07aa4d48-logs\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.340314 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.340340 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.340366 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-run\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " 
pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.340395 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81590464-a00d-470b-86f6-8050913cf609-logs\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.340417 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-dev\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.340442 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81590464-a00d-470b-86f6-8050913cf609-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.340470 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-run\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.340495 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-dev\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" 
Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.340526 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj9cl\" (UniqueName: \"kubernetes.io/projected/e22a45f5-3560-498f-8faa-e31d07aa4d48-kube-api-access-mj9cl\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.340554 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.340582 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncf9h\" (UniqueName: \"kubernetes.io/projected/81590464-a00d-470b-86f6-8050913cf609-kube-api-access-ncf9h\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.340605 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.340631 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 
crc kubenswrapper[4711]: I1203 12:49:27.340671 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.340691 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.340830 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.340872 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.340902 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c92f\" (UniqueName: \"kubernetes.io/projected/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-kube-api-access-8c92f\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 
03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.340956 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.340983 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-config-data\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.341012 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e22a45f5-3560-498f-8faa-e31d07aa4d48-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.341034 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.341060 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-sys\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 
12:49:27.341085 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.341109 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81590464-a00d-470b-86f6-8050913cf609-config-data\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.341136 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-sys\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.341229 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-sys\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.341474 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-run\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.342047 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") device mount path \"/mnt/openstack/pv12\"" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.343360 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") device mount path \"/mnt/openstack/pv02\"" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.343461 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e22a45f5-3560-498f-8faa-e31d07aa4d48-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.343595 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") device mount path \"/mnt/openstack/pv15\"" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.345223 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.345478 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.345511 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-sys\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.345545 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.345569 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.345742 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.346157 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/81590464-a00d-470b-86f6-8050913cf609-logs\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.346221 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.346263 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-run\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.346297 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-dev\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.346731 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.346790 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-etc-iscsi\") pod 
\"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.346810 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-dev\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.346842 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.346865 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e22a45f5-3560-498f-8faa-e31d07aa4d48-logs\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.347176 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81590464-a00d-470b-86f6-8050913cf609-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.348991 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e22a45f5-3560-498f-8faa-e31d07aa4d48-scripts\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " 
pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.350608 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81590464-a00d-470b-86f6-8050913cf609-scripts\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.351320 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81590464-a00d-470b-86f6-8050913cf609-config-data\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.373928 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj9cl\" (UniqueName: \"kubernetes.io/projected/e22a45f5-3560-498f-8faa-e31d07aa4d48-kube-api-access-mj9cl\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.381291 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22a45f5-3560-498f-8faa-e31d07aa4d48-config-data\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.386820 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncf9h\" (UniqueName: \"kubernetes.io/projected/81590464-a00d-470b-86f6-8050913cf609-kube-api-access-ncf9h\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " 
pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.391127 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.392474 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.396082 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-external-api-2\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.399606 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-1\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.435694 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.442228 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.442268 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-scripts\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.442313 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f962c90c-c114-413c-804e-999d1b936b65-logs\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.442331 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.442350 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " 
pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.442384 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.442383 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.442420 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-run\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.442440 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-dev\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.442462 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-dev\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 
03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.442465 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-run\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.442476 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.442538 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f962c90c-c114-413c-804e-999d1b936b65-scripts\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.442584 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.442620 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 
12:49:27.442628 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.442662 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c92f\" (UniqueName: \"kubernetes.io/projected/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-kube-api-access-8c92f\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.442683 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.442703 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-config-data\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.442719 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: 
I1203 12:49:27.442737 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f962c90c-c114-413c-804e-999d1b936b65-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.442766 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.442810 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.442838 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f962c90c-c114-413c-804e-999d1b936b65-config-data\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.442861 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-sys\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 
12:49:27.442876 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-run\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.442947 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-sys\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.442972 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-dev\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.443001 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.443037 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.443010 4711 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") device mount path \"/mnt/openstack/pv20\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.443934 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.443970 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-logs\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.443995 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.444950 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.445084 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") device mount path \"/mnt/openstack/pv16\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.445234 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-scripts\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.444012 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbgg9\" (UniqueName: \"kubernetes.io/projected/f962c90c-c114-413c-804e-999d1b936b65-kube-api-access-gbgg9\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.445298 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-sys\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.445539 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-logs\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.445789 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-config-data\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.445900 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.458020 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c92f\" (UniqueName: \"kubernetes.io/projected/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-kube-api-access-8c92f\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.463932 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.465746 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-1\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.547067 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbgg9\" (UniqueName: \"kubernetes.io/projected/f962c90c-c114-413c-804e-999d1b936b65-kube-api-access-gbgg9\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" 
Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.547117 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-sys\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.547187 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f962c90c-c114-413c-804e-999d1b936b65-logs\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.547227 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.547279 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.547301 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f962c90c-c114-413c-804e-999d1b936b65-scripts\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.547327 
4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.547351 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.547378 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.547401 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f962c90c-c114-413c-804e-999d1b936b65-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.547427 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.547461 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f962c90c-c114-413c-804e-999d1b936b65-config-data\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.547484 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-run\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.547510 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-dev\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.547603 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-dev\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.548018 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-sys\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.548741 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f962c90c-c114-413c-804e-999d1b936b65-logs\") pod 
\"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.549033 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") device mount path \"/mnt/openstack/pv17\"" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.552145 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.552201 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.552773 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.552853 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: 
\"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.552993 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") device mount path \"/mnt/openstack/pv10\"" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.554247 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f962c90c-c114-413c-804e-999d1b936b65-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.557280 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-run\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.559738 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f962c90c-c114-413c-804e-999d1b936b65-scripts\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.564396 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f962c90c-c114-413c-804e-999d1b936b65-config-data\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " 
pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.569585 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbgg9\" (UniqueName: \"kubernetes.io/projected/f962c90c-c114-413c-804e-999d1b936b65-kube-api-access-gbgg9\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.574827 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.578587 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.594383 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-internal-api-2\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.713609 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 03 12:49:27 crc kubenswrapper[4711]: W1203 12:49:27.723596 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81590464_a00d_470b_86f6_8050913cf609.slice/crio-ac695d207b75fa0940061679c874e22a459034d3b8a7bbfae8ac0f921e91f331 WatchSource:0}: Error finding container ac695d207b75fa0940061679c874e22a459034d3b8a7bbfae8ac0f921e91f331: Status 404 
returned error can't find the container with id ac695d207b75fa0940061679c874e22a459034d3b8a7bbfae8ac0f921e91f331 Dec 03 12:49:27 crc kubenswrapper[4711]: I1203 12:49:27.893517 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:28 crc kubenswrapper[4711]: I1203 12:49:28.003067 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Dec 03 12:49:28 crc kubenswrapper[4711]: I1203 12:49:28.091026 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 03 12:49:28 crc kubenswrapper[4711]: W1203 12:49:28.093867 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c1ad65c_d3ad_499e_a794_0d482cbe7df9.slice/crio-66921293016b419659e6dc70c6cbae30f53c58a2684e560791311950834290a0 WatchSource:0}: Error finding container 66921293016b419659e6dc70c6cbae30f53c58a2684e560791311950834290a0: Status 404 returned error can't find the container with id 66921293016b419659e6dc70c6cbae30f53c58a2684e560791311950834290a0 Dec 03 12:49:28 crc kubenswrapper[4711]: I1203 12:49:28.252466 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"e22a45f5-3560-498f-8faa-e31d07aa4d48","Type":"ContainerStarted","Data":"6ca23a7e884f5481496af68715e2bf40633787a48f2fac2cc0680d9dabfab15e"} Dec 03 12:49:28 crc kubenswrapper[4711]: I1203 12:49:28.252531 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"e22a45f5-3560-498f-8faa-e31d07aa4d48","Type":"ContainerStarted","Data":"f9621c3eaaa75c0fccc72a3cab3b5099742d580461218279b9942932f7277f35"} Dec 03 12:49:28 crc kubenswrapper[4711]: I1203 12:49:28.254418 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"1c1ad65c-d3ad-499e-a794-0d482cbe7df9","Type":"ContainerStarted","Data":"4bd7ed5019f716c1ab5d65b59703a65b9d243e47ccf333493586dd4c545da706"} Dec 03 12:49:28 crc kubenswrapper[4711]: I1203 12:49:28.254458 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"1c1ad65c-d3ad-499e-a794-0d482cbe7df9","Type":"ContainerStarted","Data":"66921293016b419659e6dc70c6cbae30f53c58a2684e560791311950834290a0"} Dec 03 12:49:28 crc kubenswrapper[4711]: I1203 12:49:28.256282 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"81590464-a00d-470b-86f6-8050913cf609","Type":"ContainerStarted","Data":"71b7db0ba0ed925298d54b5816cc72f3f57405454e4f25a2b49cad157ced4195"} Dec 03 12:49:28 crc kubenswrapper[4711]: I1203 12:49:28.256305 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"81590464-a00d-470b-86f6-8050913cf609","Type":"ContainerStarted","Data":"03d7f7bc9ade46a3241c9d6dfa6fc3c01460d82e553458a970076d7715bbefe7"} Dec 03 12:49:28 crc kubenswrapper[4711]: I1203 12:49:28.256318 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"81590464-a00d-470b-86f6-8050913cf609","Type":"ContainerStarted","Data":"ac695d207b75fa0940061679c874e22a459034d3b8a7bbfae8ac0f921e91f331"} Dec 03 12:49:28 crc kubenswrapper[4711]: I1203 12:49:28.280516 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-1" podStartSLOduration=2.280497882 podStartE2EDuration="2.280497882s" podCreationTimestamp="2025-12-03 12:49:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:49:28.277021847 +0000 UTC 
m=+2086.946273112" watchObservedRunningTime="2025-12-03 12:49:28.280497882 +0000 UTC m=+2086.949749147" Dec 03 12:49:28 crc kubenswrapper[4711]: I1203 12:49:28.365685 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Dec 03 12:49:28 crc kubenswrapper[4711]: W1203 12:49:28.377133 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf962c90c_c114_413c_804e_999d1b936b65.slice/crio-1f0467aa6a2211d8f905febcced419c5edeb852ca05efc8a4e6232da0e0ea7b6 WatchSource:0}: Error finding container 1f0467aa6a2211d8f905febcced419c5edeb852ca05efc8a4e6232da0e0ea7b6: Status 404 returned error can't find the container with id 1f0467aa6a2211d8f905febcced419c5edeb852ca05efc8a4e6232da0e0ea7b6 Dec 03 12:49:29 crc kubenswrapper[4711]: I1203 12:49:29.269880 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"e22a45f5-3560-498f-8faa-e31d07aa4d48","Type":"ContainerStarted","Data":"05867f09a678af61853f73fa1db7466019970c4e55ba134a6a780f30fb150a82"} Dec 03 12:49:29 crc kubenswrapper[4711]: I1203 12:49:29.272391 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"f962c90c-c114-413c-804e-999d1b936b65","Type":"ContainerStarted","Data":"857ef73900a457ab1fba1313522ab4ce0d259b2ac31b4d9ea1f5be86ef42b1ff"} Dec 03 12:49:29 crc kubenswrapper[4711]: I1203 12:49:29.272436 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"f962c90c-c114-413c-804e-999d1b936b65","Type":"ContainerStarted","Data":"c24c104790aa2954664ca3aedaefe267a8358a3abd39a0c1080aac720717cbaf"} Dec 03 12:49:29 crc kubenswrapper[4711]: I1203 12:49:29.272454 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" 
event={"ID":"f962c90c-c114-413c-804e-999d1b936b65","Type":"ContainerStarted","Data":"1f0467aa6a2211d8f905febcced419c5edeb852ca05efc8a4e6232da0e0ea7b6"} Dec 03 12:49:29 crc kubenswrapper[4711]: I1203 12:49:29.274856 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"1c1ad65c-d3ad-499e-a794-0d482cbe7df9","Type":"ContainerStarted","Data":"33f43703e13b13804a5ff21a839fb6313a2e4ce976695abca2754529ff92197c"} Dec 03 12:49:29 crc kubenswrapper[4711]: I1203 12:49:29.300257 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-2" podStartSLOduration=3.3002392560000002 podStartE2EDuration="3.300239256s" podCreationTimestamp="2025-12-03 12:49:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:49:29.295076164 +0000 UTC m=+2087.964327429" watchObservedRunningTime="2025-12-03 12:49:29.300239256 +0000 UTC m=+2087.969490511" Dec 03 12:49:29 crc kubenswrapper[4711]: I1203 12:49:29.353436 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-2" podStartSLOduration=3.352055063 podStartE2EDuration="3.352055063s" podCreationTimestamp="2025-12-03 12:49:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:49:29.319651006 +0000 UTC m=+2087.988902291" watchObservedRunningTime="2025-12-03 12:49:29.352055063 +0000 UTC m=+2088.021306318" Dec 03 12:49:29 crc kubenswrapper[4711]: I1203 12:49:29.363592 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-1" podStartSLOduration=3.363562488 podStartE2EDuration="3.363562488s" podCreationTimestamp="2025-12-03 12:49:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:49:29.344437764 +0000 UTC m=+2088.013689029" watchObservedRunningTime="2025-12-03 12:49:29.363562488 +0000 UTC m=+2088.032813753" Dec 03 12:49:32 crc kubenswrapper[4711]: I1203 12:49:32.888075 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cvsph"] Dec 03 12:49:32 crc kubenswrapper[4711]: I1203 12:49:32.890876 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cvsph" Dec 03 12:49:32 crc kubenswrapper[4711]: I1203 12:49:32.899673 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cvsph"] Dec 03 12:49:33 crc kubenswrapper[4711]: I1203 12:49:33.051138 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aab1c8ee-fd97-49f0-8376-67af38c710f3-utilities\") pod \"redhat-marketplace-cvsph\" (UID: \"aab1c8ee-fd97-49f0-8376-67af38c710f3\") " pod="openshift-marketplace/redhat-marketplace-cvsph" Dec 03 12:49:33 crc kubenswrapper[4711]: I1203 12:49:33.051264 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aab1c8ee-fd97-49f0-8376-67af38c710f3-catalog-content\") pod \"redhat-marketplace-cvsph\" (UID: \"aab1c8ee-fd97-49f0-8376-67af38c710f3\") " pod="openshift-marketplace/redhat-marketplace-cvsph" Dec 03 12:49:33 crc kubenswrapper[4711]: I1203 12:49:33.051290 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-444ff\" (UniqueName: \"kubernetes.io/projected/aab1c8ee-fd97-49f0-8376-67af38c710f3-kube-api-access-444ff\") pod \"redhat-marketplace-cvsph\" (UID: \"aab1c8ee-fd97-49f0-8376-67af38c710f3\") " 
pod="openshift-marketplace/redhat-marketplace-cvsph" Dec 03 12:49:33 crc kubenswrapper[4711]: I1203 12:49:33.153289 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aab1c8ee-fd97-49f0-8376-67af38c710f3-utilities\") pod \"redhat-marketplace-cvsph\" (UID: \"aab1c8ee-fd97-49f0-8376-67af38c710f3\") " pod="openshift-marketplace/redhat-marketplace-cvsph" Dec 03 12:49:33 crc kubenswrapper[4711]: I1203 12:49:33.153675 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aab1c8ee-fd97-49f0-8376-67af38c710f3-catalog-content\") pod \"redhat-marketplace-cvsph\" (UID: \"aab1c8ee-fd97-49f0-8376-67af38c710f3\") " pod="openshift-marketplace/redhat-marketplace-cvsph" Dec 03 12:49:33 crc kubenswrapper[4711]: I1203 12:49:33.153698 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-444ff\" (UniqueName: \"kubernetes.io/projected/aab1c8ee-fd97-49f0-8376-67af38c710f3-kube-api-access-444ff\") pod \"redhat-marketplace-cvsph\" (UID: \"aab1c8ee-fd97-49f0-8376-67af38c710f3\") " pod="openshift-marketplace/redhat-marketplace-cvsph" Dec 03 12:49:33 crc kubenswrapper[4711]: I1203 12:49:33.153999 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aab1c8ee-fd97-49f0-8376-67af38c710f3-utilities\") pod \"redhat-marketplace-cvsph\" (UID: \"aab1c8ee-fd97-49f0-8376-67af38c710f3\") " pod="openshift-marketplace/redhat-marketplace-cvsph" Dec 03 12:49:33 crc kubenswrapper[4711]: I1203 12:49:33.154251 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aab1c8ee-fd97-49f0-8376-67af38c710f3-catalog-content\") pod \"redhat-marketplace-cvsph\" (UID: \"aab1c8ee-fd97-49f0-8376-67af38c710f3\") " pod="openshift-marketplace/redhat-marketplace-cvsph" 
Dec 03 12:49:33 crc kubenswrapper[4711]: I1203 12:49:33.172196 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-444ff\" (UniqueName: \"kubernetes.io/projected/aab1c8ee-fd97-49f0-8376-67af38c710f3-kube-api-access-444ff\") pod \"redhat-marketplace-cvsph\" (UID: \"aab1c8ee-fd97-49f0-8376-67af38c710f3\") " pod="openshift-marketplace/redhat-marketplace-cvsph" Dec 03 12:49:33 crc kubenswrapper[4711]: I1203 12:49:33.219333 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cvsph" Dec 03 12:49:33 crc kubenswrapper[4711]: I1203 12:49:33.661953 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cvsph"] Dec 03 12:49:33 crc kubenswrapper[4711]: W1203 12:49:33.662448 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaab1c8ee_fd97_49f0_8376_67af38c710f3.slice/crio-d8dcc024dd590af54cd63e0cc9852a22a944b8f97157c2c92ce5ef845357095e WatchSource:0}: Error finding container d8dcc024dd590af54cd63e0cc9852a22a944b8f97157c2c92ce5ef845357095e: Status 404 returned error can't find the container with id d8dcc024dd590af54cd63e0cc9852a22a944b8f97157c2c92ce5ef845357095e Dec 03 12:49:34 crc kubenswrapper[4711]: I1203 12:49:34.318394 4711 generic.go:334] "Generic (PLEG): container finished" podID="aab1c8ee-fd97-49f0-8376-67af38c710f3" containerID="dd907444622a8705a0c22d3e2afeb2bec86610efdb9d2a2316ad6b3d138cf751" exitCode=0 Dec 03 12:49:34 crc kubenswrapper[4711]: I1203 12:49:34.318450 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cvsph" event={"ID":"aab1c8ee-fd97-49f0-8376-67af38c710f3","Type":"ContainerDied","Data":"dd907444622a8705a0c22d3e2afeb2bec86610efdb9d2a2316ad6b3d138cf751"} Dec 03 12:49:34 crc kubenswrapper[4711]: I1203 12:49:34.318488 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-cvsph" event={"ID":"aab1c8ee-fd97-49f0-8376-67af38c710f3","Type":"ContainerStarted","Data":"d8dcc024dd590af54cd63e0cc9852a22a944b8f97157c2c92ce5ef845357095e"} Dec 03 12:49:34 crc kubenswrapper[4711]: I1203 12:49:34.321239 4711 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 12:49:35 crc kubenswrapper[4711]: I1203 12:49:35.329497 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cvsph" event={"ID":"aab1c8ee-fd97-49f0-8376-67af38c710f3","Type":"ContainerStarted","Data":"3c4e0e9016099533759e13708e98f919f27232a2a03faaf4d30f8873eee58e8d"} Dec 03 12:49:36 crc kubenswrapper[4711]: I1203 12:49:36.340131 4711 generic.go:334] "Generic (PLEG): container finished" podID="aab1c8ee-fd97-49f0-8376-67af38c710f3" containerID="3c4e0e9016099533759e13708e98f919f27232a2a03faaf4d30f8873eee58e8d" exitCode=0 Dec 03 12:49:36 crc kubenswrapper[4711]: I1203 12:49:36.340204 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cvsph" event={"ID":"aab1c8ee-fd97-49f0-8376-67af38c710f3","Type":"ContainerDied","Data":"3c4e0e9016099533759e13708e98f919f27232a2a03faaf4d30f8873eee58e8d"} Dec 03 12:49:37 crc kubenswrapper[4711]: I1203 12:49:37.350419 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cvsph" event={"ID":"aab1c8ee-fd97-49f0-8376-67af38c710f3","Type":"ContainerStarted","Data":"6639fe151bf786a80021c4b7a3c0eb1deeeb8de61b0d6e4beeb093c6ab627e1f"} Dec 03 12:49:37 crc kubenswrapper[4711]: I1203 12:49:37.382806 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cvsph" podStartSLOduration=2.836460008 podStartE2EDuration="5.382785587s" podCreationTimestamp="2025-12-03 12:49:32 +0000 UTC" firstStartedPulling="2025-12-03 12:49:34.320858509 +0000 UTC m=+2092.990109784" 
lastFinishedPulling="2025-12-03 12:49:36.867184088 +0000 UTC m=+2095.536435363" observedRunningTime="2025-12-03 12:49:37.378037908 +0000 UTC m=+2096.047289253" watchObservedRunningTime="2025-12-03 12:49:37.382785587 +0000 UTC m=+2096.052036852" Dec 03 12:49:37 crc kubenswrapper[4711]: I1203 12:49:37.436642 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:37 crc kubenswrapper[4711]: I1203 12:49:37.436963 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:37 crc kubenswrapper[4711]: I1203 12:49:37.446957 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:37 crc kubenswrapper[4711]: I1203 12:49:37.447007 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:37 crc kubenswrapper[4711]: I1203 12:49:37.470861 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:37 crc kubenswrapper[4711]: I1203 12:49:37.497711 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:37 crc kubenswrapper[4711]: I1203 12:49:37.499983 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:37 crc kubenswrapper[4711]: I1203 12:49:37.505591 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:37 crc kubenswrapper[4711]: I1203 12:49:37.575650 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:37 
crc kubenswrapper[4711]: I1203 12:49:37.576036 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:37 crc kubenswrapper[4711]: I1203 12:49:37.603710 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:37 crc kubenswrapper[4711]: I1203 12:49:37.610078 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:37 crc kubenswrapper[4711]: I1203 12:49:37.893960 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:37 crc kubenswrapper[4711]: I1203 12:49:37.894006 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:37 crc kubenswrapper[4711]: I1203 12:49:37.920128 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:37 crc kubenswrapper[4711]: I1203 12:49:37.936261 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:38 crc kubenswrapper[4711]: I1203 12:49:38.359255 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:38 crc kubenswrapper[4711]: I1203 12:49:38.359346 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:38 crc kubenswrapper[4711]: I1203 12:49:38.359388 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:38 crc kubenswrapper[4711]: I1203 12:49:38.359400 4711 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:38 crc kubenswrapper[4711]: I1203 12:49:38.359410 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:38 crc kubenswrapper[4711]: I1203 12:49:38.359419 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:38 crc kubenswrapper[4711]: I1203 12:49:38.359428 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:38 crc kubenswrapper[4711]: I1203 12:49:38.359438 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:38 crc kubenswrapper[4711]: I1203 12:49:38.474148 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nvcfc"] Dec 03 12:49:38 crc kubenswrapper[4711]: I1203 12:49:38.475689 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nvcfc" Dec 03 12:49:38 crc kubenswrapper[4711]: I1203 12:49:38.499429 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nvcfc"] Dec 03 12:49:38 crc kubenswrapper[4711]: I1203 12:49:38.696574 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff7b9727-150f-4ee5-b22b-3147877bdaa6-catalog-content\") pod \"community-operators-nvcfc\" (UID: \"ff7b9727-150f-4ee5-b22b-3147877bdaa6\") " pod="openshift-marketplace/community-operators-nvcfc" Dec 03 12:49:38 crc kubenswrapper[4711]: I1203 12:49:38.696669 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f5c9\" (UniqueName: \"kubernetes.io/projected/ff7b9727-150f-4ee5-b22b-3147877bdaa6-kube-api-access-6f5c9\") pod \"community-operators-nvcfc\" (UID: \"ff7b9727-150f-4ee5-b22b-3147877bdaa6\") " pod="openshift-marketplace/community-operators-nvcfc" Dec 03 12:49:38 crc kubenswrapper[4711]: I1203 12:49:38.696896 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff7b9727-150f-4ee5-b22b-3147877bdaa6-utilities\") pod \"community-operators-nvcfc\" (UID: \"ff7b9727-150f-4ee5-b22b-3147877bdaa6\") " pod="openshift-marketplace/community-operators-nvcfc" Dec 03 12:49:38 crc kubenswrapper[4711]: I1203 12:49:38.799220 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f5c9\" (UniqueName: \"kubernetes.io/projected/ff7b9727-150f-4ee5-b22b-3147877bdaa6-kube-api-access-6f5c9\") pod \"community-operators-nvcfc\" (UID: \"ff7b9727-150f-4ee5-b22b-3147877bdaa6\") " pod="openshift-marketplace/community-operators-nvcfc" Dec 03 12:49:38 crc kubenswrapper[4711]: I1203 12:49:38.799534 4711 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff7b9727-150f-4ee5-b22b-3147877bdaa6-utilities\") pod \"community-operators-nvcfc\" (UID: \"ff7b9727-150f-4ee5-b22b-3147877bdaa6\") " pod="openshift-marketplace/community-operators-nvcfc" Dec 03 12:49:38 crc kubenswrapper[4711]: I1203 12:49:38.799691 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff7b9727-150f-4ee5-b22b-3147877bdaa6-catalog-content\") pod \"community-operators-nvcfc\" (UID: \"ff7b9727-150f-4ee5-b22b-3147877bdaa6\") " pod="openshift-marketplace/community-operators-nvcfc" Dec 03 12:49:38 crc kubenswrapper[4711]: I1203 12:49:38.800165 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff7b9727-150f-4ee5-b22b-3147877bdaa6-catalog-content\") pod \"community-operators-nvcfc\" (UID: \"ff7b9727-150f-4ee5-b22b-3147877bdaa6\") " pod="openshift-marketplace/community-operators-nvcfc" Dec 03 12:49:38 crc kubenswrapper[4711]: I1203 12:49:38.800216 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff7b9727-150f-4ee5-b22b-3147877bdaa6-utilities\") pod \"community-operators-nvcfc\" (UID: \"ff7b9727-150f-4ee5-b22b-3147877bdaa6\") " pod="openshift-marketplace/community-operators-nvcfc" Dec 03 12:49:38 crc kubenswrapper[4711]: I1203 12:49:38.826889 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f5c9\" (UniqueName: \"kubernetes.io/projected/ff7b9727-150f-4ee5-b22b-3147877bdaa6-kube-api-access-6f5c9\") pod \"community-operators-nvcfc\" (UID: \"ff7b9727-150f-4ee5-b22b-3147877bdaa6\") " pod="openshift-marketplace/community-operators-nvcfc" Dec 03 12:49:39 crc kubenswrapper[4711]: I1203 12:49:39.103193 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nvcfc" Dec 03 12:49:39 crc kubenswrapper[4711]: I1203 12:49:39.624295 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nvcfc"] Dec 03 12:49:40 crc kubenswrapper[4711]: I1203 12:49:40.375952 4711 generic.go:334] "Generic (PLEG): container finished" podID="ff7b9727-150f-4ee5-b22b-3147877bdaa6" containerID="0d590ac250f716d313dcc2d5ac96a3b788cd92009296bc442d3ae8f6ad7bf683" exitCode=0 Dec 03 12:49:40 crc kubenswrapper[4711]: I1203 12:49:40.376077 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvcfc" event={"ID":"ff7b9727-150f-4ee5-b22b-3147877bdaa6","Type":"ContainerDied","Data":"0d590ac250f716d313dcc2d5ac96a3b788cd92009296bc442d3ae8f6ad7bf683"} Dec 03 12:49:40 crc kubenswrapper[4711]: I1203 12:49:40.376346 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvcfc" event={"ID":"ff7b9727-150f-4ee5-b22b-3147877bdaa6","Type":"ContainerStarted","Data":"b893e089a750a3a0b9d8fbabd8aa59b4e0e43e30bc8d0d3ae2813ba74b9420c6"} Dec 03 12:49:40 crc kubenswrapper[4711]: I1203 12:49:40.401268 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:40 crc kubenswrapper[4711]: I1203 12:49:40.401396 4711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 12:49:40 crc kubenswrapper[4711]: I1203 12:49:40.658095 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:40 crc kubenswrapper[4711]: I1203 12:49:40.658216 4711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 12:49:40 crc kubenswrapper[4711]: I1203 12:49:40.668297 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:40 crc kubenswrapper[4711]: I1203 12:49:40.788685 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:40 crc kubenswrapper[4711]: I1203 12:49:40.789096 4711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 12:49:40 crc kubenswrapper[4711]: I1203 12:49:40.799772 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:40 crc kubenswrapper[4711]: I1203 12:49:40.801620 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:41 crc kubenswrapper[4711]: I1203 12:49:41.069135 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zrhrt"] Dec 03 12:49:41 crc kubenswrapper[4711]: I1203 12:49:41.071074 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zrhrt" Dec 03 12:49:41 crc kubenswrapper[4711]: I1203 12:49:41.086028 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zrhrt"] Dec 03 12:49:41 crc kubenswrapper[4711]: I1203 12:49:41.248054 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a738f0e-1cb5-49ee-9b12-999813383fff-utilities\") pod \"certified-operators-zrhrt\" (UID: \"4a738f0e-1cb5-49ee-9b12-999813383fff\") " pod="openshift-marketplace/certified-operators-zrhrt" Dec 03 12:49:41 crc kubenswrapper[4711]: I1203 12:49:41.248179 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a738f0e-1cb5-49ee-9b12-999813383fff-catalog-content\") pod \"certified-operators-zrhrt\" (UID: \"4a738f0e-1cb5-49ee-9b12-999813383fff\") " pod="openshift-marketplace/certified-operators-zrhrt" Dec 03 12:49:41 crc kubenswrapper[4711]: I1203 12:49:41.248364 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5xmb\" (UniqueName: \"kubernetes.io/projected/4a738f0e-1cb5-49ee-9b12-999813383fff-kube-api-access-d5xmb\") pod \"certified-operators-zrhrt\" (UID: \"4a738f0e-1cb5-49ee-9b12-999813383fff\") " pod="openshift-marketplace/certified-operators-zrhrt" Dec 03 12:49:41 crc kubenswrapper[4711]: I1203 12:49:41.350450 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5xmb\" (UniqueName: \"kubernetes.io/projected/4a738f0e-1cb5-49ee-9b12-999813383fff-kube-api-access-d5xmb\") pod \"certified-operators-zrhrt\" (UID: \"4a738f0e-1cb5-49ee-9b12-999813383fff\") " pod="openshift-marketplace/certified-operators-zrhrt" Dec 03 12:49:41 crc kubenswrapper[4711]: I1203 12:49:41.350592 4711 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a738f0e-1cb5-49ee-9b12-999813383fff-utilities\") pod \"certified-operators-zrhrt\" (UID: \"4a738f0e-1cb5-49ee-9b12-999813383fff\") " pod="openshift-marketplace/certified-operators-zrhrt" Dec 03 12:49:41 crc kubenswrapper[4711]: I1203 12:49:41.350645 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a738f0e-1cb5-49ee-9b12-999813383fff-catalog-content\") pod \"certified-operators-zrhrt\" (UID: \"4a738f0e-1cb5-49ee-9b12-999813383fff\") " pod="openshift-marketplace/certified-operators-zrhrt" Dec 03 12:49:41 crc kubenswrapper[4711]: I1203 12:49:41.351293 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a738f0e-1cb5-49ee-9b12-999813383fff-catalog-content\") pod \"certified-operators-zrhrt\" (UID: \"4a738f0e-1cb5-49ee-9b12-999813383fff\") " pod="openshift-marketplace/certified-operators-zrhrt" Dec 03 12:49:41 crc kubenswrapper[4711]: I1203 12:49:41.351314 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a738f0e-1cb5-49ee-9b12-999813383fff-utilities\") pod \"certified-operators-zrhrt\" (UID: \"4a738f0e-1cb5-49ee-9b12-999813383fff\") " pod="openshift-marketplace/certified-operators-zrhrt" Dec 03 12:49:41 crc kubenswrapper[4711]: I1203 12:49:41.381882 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5xmb\" (UniqueName: \"kubernetes.io/projected/4a738f0e-1cb5-49ee-9b12-999813383fff-kube-api-access-d5xmb\") pod \"certified-operators-zrhrt\" (UID: \"4a738f0e-1cb5-49ee-9b12-999813383fff\") " pod="openshift-marketplace/certified-operators-zrhrt" Dec 03 12:49:41 crc kubenswrapper[4711]: I1203 12:49:41.393256 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zrhrt" Dec 03 12:49:41 crc kubenswrapper[4711]: I1203 12:49:41.719494 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zrhrt"] Dec 03 12:49:42 crc kubenswrapper[4711]: I1203 12:49:42.393337 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrhrt" event={"ID":"4a738f0e-1cb5-49ee-9b12-999813383fff","Type":"ContainerStarted","Data":"a426bb585044ada421c358ea2886fb608419eddf18f6e1f6ff92a59817aeaa2a"} Dec 03 12:49:42 crc kubenswrapper[4711]: I1203 12:49:42.581690 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:42 crc kubenswrapper[4711]: I1203 12:49:42.582177 4711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 12:49:42 crc kubenswrapper[4711]: I1203 12:49:42.640529 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:43 crc kubenswrapper[4711]: I1203 12:49:43.219615 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cvsph" Dec 03 12:49:43 crc kubenswrapper[4711]: I1203 12:49:43.219864 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cvsph" Dec 03 12:49:43 crc kubenswrapper[4711]: I1203 12:49:43.268566 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cvsph" Dec 03 12:49:43 crc kubenswrapper[4711]: I1203 12:49:43.402734 4711 generic.go:334] "Generic (PLEG): container finished" podID="4a738f0e-1cb5-49ee-9b12-999813383fff" containerID="997b9c3965dbe47f13ab228345e9fe8907bfafa6396e9329647eb0766916e01a" exitCode=0 Dec 03 12:49:43 crc kubenswrapper[4711]: I1203 12:49:43.402829 4711 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrhrt" event={"ID":"4a738f0e-1cb5-49ee-9b12-999813383fff","Type":"ContainerDied","Data":"997b9c3965dbe47f13ab228345e9fe8907bfafa6396e9329647eb0766916e01a"} Dec 03 12:49:43 crc kubenswrapper[4711]: I1203 12:49:43.410041 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvcfc" event={"ID":"ff7b9727-150f-4ee5-b22b-3147877bdaa6","Type":"ContainerStarted","Data":"5c60d03e10a0a2b04ed188bef5ea3ef704d6137b4291e6365947fa739b50f745"} Dec 03 12:49:43 crc kubenswrapper[4711]: I1203 12:49:43.479311 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cvsph" Dec 03 12:49:43 crc kubenswrapper[4711]: I1203 12:49:43.809625 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Dec 03 12:49:43 crc kubenswrapper[4711]: I1203 12:49:43.810106 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="e22a45f5-3560-498f-8faa-e31d07aa4d48" containerName="glance-httpd" containerID="cri-o://05867f09a678af61853f73fa1db7466019970c4e55ba134a6a780f30fb150a82" gracePeriod=30 Dec 03 12:49:43 crc kubenswrapper[4711]: I1203 12:49:43.810383 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="e22a45f5-3560-498f-8faa-e31d07aa4d48" containerName="glance-log" containerID="cri-o://6ca23a7e884f5481496af68715e2bf40633787a48f2fac2cc0680d9dabfab15e" gracePeriod=30 Dec 03 12:49:43 crc kubenswrapper[4711]: I1203 12:49:43.817804 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 03 12:49:43 crc kubenswrapper[4711]: I1203 12:49:43.818059 4711 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="glance-kuttl-tests/glance-default-external-api-1" podUID="81590464-a00d-470b-86f6-8050913cf609" containerName="glance-log" containerID="cri-o://03d7f7bc9ade46a3241c9d6dfa6fc3c01460d82e553458a970076d7715bbefe7" gracePeriod=30 Dec 03 12:49:43 crc kubenswrapper[4711]: I1203 12:49:43.818196 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="81590464-a00d-470b-86f6-8050913cf609" containerName="glance-httpd" containerID="cri-o://71b7db0ba0ed925298d54b5816cc72f3f57405454e4f25a2b49cad157ced4195" gracePeriod=30 Dec 03 12:49:44 crc kubenswrapper[4711]: I1203 12:49:44.010605 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Dec 03 12:49:44 crc kubenswrapper[4711]: I1203 12:49:44.012187 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="f962c90c-c114-413c-804e-999d1b936b65" containerName="glance-log" containerID="cri-o://c24c104790aa2954664ca3aedaefe267a8358a3abd39a0c1080aac720717cbaf" gracePeriod=30 Dec 03 12:49:44 crc kubenswrapper[4711]: I1203 12:49:44.012706 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="f962c90c-c114-413c-804e-999d1b936b65" containerName="glance-httpd" containerID="cri-o://857ef73900a457ab1fba1313522ab4ce0d259b2ac31b4d9ea1f5be86ef42b1ff" gracePeriod=30 Dec 03 12:49:44 crc kubenswrapper[4711]: I1203 12:49:44.020566 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 03 12:49:44 crc kubenswrapper[4711]: I1203 12:49:44.421664 4711 generic.go:334] "Generic (PLEG): container finished" podID="ff7b9727-150f-4ee5-b22b-3147877bdaa6" containerID="5c60d03e10a0a2b04ed188bef5ea3ef704d6137b4291e6365947fa739b50f745" exitCode=0 Dec 03 12:49:44 crc kubenswrapper[4711]: I1203 
12:49:44.421730 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvcfc" event={"ID":"ff7b9727-150f-4ee5-b22b-3147877bdaa6","Type":"ContainerDied","Data":"5c60d03e10a0a2b04ed188bef5ea3ef704d6137b4291e6365947fa739b50f745"} Dec 03 12:49:44 crc kubenswrapper[4711]: I1203 12:49:44.424571 4711 generic.go:334] "Generic (PLEG): container finished" podID="81590464-a00d-470b-86f6-8050913cf609" containerID="03d7f7bc9ade46a3241c9d6dfa6fc3c01460d82e553458a970076d7715bbefe7" exitCode=143 Dec 03 12:49:44 crc kubenswrapper[4711]: I1203 12:49:44.424654 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"81590464-a00d-470b-86f6-8050913cf609","Type":"ContainerDied","Data":"03d7f7bc9ade46a3241c9d6dfa6fc3c01460d82e553458a970076d7715bbefe7"} Dec 03 12:49:44 crc kubenswrapper[4711]: I1203 12:49:44.425017 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="1c1ad65c-d3ad-499e-a794-0d482cbe7df9" containerName="glance-log" containerID="cri-o://4bd7ed5019f716c1ab5d65b59703a65b9d243e47ccf333493586dd4c545da706" gracePeriod=30 Dec 03 12:49:44 crc kubenswrapper[4711]: I1203 12:49:44.425064 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="1c1ad65c-d3ad-499e-a794-0d482cbe7df9" containerName="glance-httpd" containerID="cri-o://33f43703e13b13804a5ff21a839fb6313a2e4ce976695abca2754529ff92197c" gracePeriod=30 Dec 03 12:49:44 crc kubenswrapper[4711]: I1203 12:49:44.432286 4711 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="1c1ad65c-d3ad-499e-a794-0d482cbe7df9" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.0.127:9292/healthcheck\": EOF" Dec 03 12:49:44 crc kubenswrapper[4711]: I1203 12:49:44.434616 4711 
prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="1c1ad65c-d3ad-499e-a794-0d482cbe7df9" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.0.127:9292/healthcheck\": EOF" Dec 03 12:49:45 crc kubenswrapper[4711]: I1203 12:49:45.434725 4711 generic.go:334] "Generic (PLEG): container finished" podID="1c1ad65c-d3ad-499e-a794-0d482cbe7df9" containerID="4bd7ed5019f716c1ab5d65b59703a65b9d243e47ccf333493586dd4c545da706" exitCode=143 Dec 03 12:49:45 crc kubenswrapper[4711]: I1203 12:49:45.434902 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"1c1ad65c-d3ad-499e-a794-0d482cbe7df9","Type":"ContainerDied","Data":"4bd7ed5019f716c1ab5d65b59703a65b9d243e47ccf333493586dd4c545da706"} Dec 03 12:49:45 crc kubenswrapper[4711]: I1203 12:49:45.437941 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvcfc" event={"ID":"ff7b9727-150f-4ee5-b22b-3147877bdaa6","Type":"ContainerStarted","Data":"b59f4aaaa98e0ec9d11cb1211ad6f210cf0a6b462ff2f6f6882add408328f6ca"} Dec 03 12:49:45 crc kubenswrapper[4711]: I1203 12:49:45.440366 4711 generic.go:334] "Generic (PLEG): container finished" podID="4a738f0e-1cb5-49ee-9b12-999813383fff" containerID="b91a16cf3c5cd00790b34dd0928c60efb70847fdc5750486cd97cdeef3063351" exitCode=0 Dec 03 12:49:45 crc kubenswrapper[4711]: I1203 12:49:45.440444 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrhrt" event={"ID":"4a738f0e-1cb5-49ee-9b12-999813383fff","Type":"ContainerDied","Data":"b91a16cf3c5cd00790b34dd0928c60efb70847fdc5750486cd97cdeef3063351"} Dec 03 12:49:45 crc kubenswrapper[4711]: I1203 12:49:45.442174 4711 generic.go:334] "Generic (PLEG): container finished" podID="e22a45f5-3560-498f-8faa-e31d07aa4d48" containerID="6ca23a7e884f5481496af68715e2bf40633787a48f2fac2cc0680d9dabfab15e" 
exitCode=143 Dec 03 12:49:45 crc kubenswrapper[4711]: I1203 12:49:45.442233 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"e22a45f5-3560-498f-8faa-e31d07aa4d48","Type":"ContainerDied","Data":"6ca23a7e884f5481496af68715e2bf40633787a48f2fac2cc0680d9dabfab15e"} Dec 03 12:49:45 crc kubenswrapper[4711]: I1203 12:49:45.445656 4711 generic.go:334] "Generic (PLEG): container finished" podID="f962c90c-c114-413c-804e-999d1b936b65" containerID="c24c104790aa2954664ca3aedaefe267a8358a3abd39a0c1080aac720717cbaf" exitCode=143 Dec 03 12:49:45 crc kubenswrapper[4711]: I1203 12:49:45.445741 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"f962c90c-c114-413c-804e-999d1b936b65","Type":"ContainerDied","Data":"c24c104790aa2954664ca3aedaefe267a8358a3abd39a0c1080aac720717cbaf"} Dec 03 12:49:45 crc kubenswrapper[4711]: I1203 12:49:45.468232 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nvcfc" podStartSLOduration=3.016305682 podStartE2EDuration="7.468215715s" podCreationTimestamp="2025-12-03 12:49:38 +0000 UTC" firstStartedPulling="2025-12-03 12:49:40.377481294 +0000 UTC m=+2099.046732549" lastFinishedPulling="2025-12-03 12:49:44.829391327 +0000 UTC m=+2103.498642582" observedRunningTime="2025-12-03 12:49:45.46287825 +0000 UTC m=+2104.132129515" watchObservedRunningTime="2025-12-03 12:49:45.468215715 +0000 UTC m=+2104.137466970" Dec 03 12:49:46 crc kubenswrapper[4711]: I1203 12:49:46.455244 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrhrt" event={"ID":"4a738f0e-1cb5-49ee-9b12-999813383fff","Type":"ContainerStarted","Data":"c04f7cc4725af33d4d76fc2208377c2c6c0cbdb9be62325dc0841113d2f3468e"} Dec 03 12:49:46 crc kubenswrapper[4711]: I1203 12:49:46.468985 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-cvsph"] Dec 03 12:49:46 crc kubenswrapper[4711]: I1203 12:49:46.469229 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cvsph" podUID="aab1c8ee-fd97-49f0-8376-67af38c710f3" containerName="registry-server" containerID="cri-o://6639fe151bf786a80021c4b7a3c0eb1deeeb8de61b0d6e4beeb093c6ab627e1f" gracePeriod=2 Dec 03 12:49:46 crc kubenswrapper[4711]: I1203 12:49:46.485751 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zrhrt" podStartSLOduration=3.029936458 podStartE2EDuration="5.485736578s" podCreationTimestamp="2025-12-03 12:49:41 +0000 UTC" firstStartedPulling="2025-12-03 12:49:43.406672225 +0000 UTC m=+2102.075923490" lastFinishedPulling="2025-12-03 12:49:45.862472355 +0000 UTC m=+2104.531723610" observedRunningTime="2025-12-03 12:49:46.480933817 +0000 UTC m=+2105.150185082" watchObservedRunningTime="2025-12-03 12:49:46.485736578 +0000 UTC m=+2105.154987833" Dec 03 12:49:46 crc kubenswrapper[4711]: I1203 12:49:46.899900 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cvsph" Dec 03 12:49:46 crc kubenswrapper[4711]: I1203 12:49:46.950962 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-444ff\" (UniqueName: \"kubernetes.io/projected/aab1c8ee-fd97-49f0-8376-67af38c710f3-kube-api-access-444ff\") pod \"aab1c8ee-fd97-49f0-8376-67af38c710f3\" (UID: \"aab1c8ee-fd97-49f0-8376-67af38c710f3\") " Dec 03 12:49:46 crc kubenswrapper[4711]: I1203 12:49:46.951179 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aab1c8ee-fd97-49f0-8376-67af38c710f3-catalog-content\") pod \"aab1c8ee-fd97-49f0-8376-67af38c710f3\" (UID: \"aab1c8ee-fd97-49f0-8376-67af38c710f3\") " Dec 03 12:49:46 crc kubenswrapper[4711]: I1203 12:49:46.951380 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aab1c8ee-fd97-49f0-8376-67af38c710f3-utilities\") pod \"aab1c8ee-fd97-49f0-8376-67af38c710f3\" (UID: \"aab1c8ee-fd97-49f0-8376-67af38c710f3\") " Dec 03 12:49:46 crc kubenswrapper[4711]: I1203 12:49:46.952998 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aab1c8ee-fd97-49f0-8376-67af38c710f3-utilities" (OuterVolumeSpecName: "utilities") pod "aab1c8ee-fd97-49f0-8376-67af38c710f3" (UID: "aab1c8ee-fd97-49f0-8376-67af38c710f3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:49:46 crc kubenswrapper[4711]: I1203 12:49:46.957370 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aab1c8ee-fd97-49f0-8376-67af38c710f3-kube-api-access-444ff" (OuterVolumeSpecName: "kube-api-access-444ff") pod "aab1c8ee-fd97-49f0-8376-67af38c710f3" (UID: "aab1c8ee-fd97-49f0-8376-67af38c710f3"). InnerVolumeSpecName "kube-api-access-444ff". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:49:46 crc kubenswrapper[4711]: I1203 12:49:46.982127 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aab1c8ee-fd97-49f0-8376-67af38c710f3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aab1c8ee-fd97-49f0-8376-67af38c710f3" (UID: "aab1c8ee-fd97-49f0-8376-67af38c710f3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.054248 4711 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aab1c8ee-fd97-49f0-8376-67af38c710f3-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.054500 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-444ff\" (UniqueName: \"kubernetes.io/projected/aab1c8ee-fd97-49f0-8376-67af38c710f3-kube-api-access-444ff\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.054511 4711 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aab1c8ee-fd97-49f0-8376-67af38c710f3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.429162 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.466268 4711 generic.go:334] "Generic (PLEG): container finished" podID="aab1c8ee-fd97-49f0-8376-67af38c710f3" containerID="6639fe151bf786a80021c4b7a3c0eb1deeeb8de61b0d6e4beeb093c6ab627e1f" exitCode=0 Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.466321 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cvsph" event={"ID":"aab1c8ee-fd97-49f0-8376-67af38c710f3","Type":"ContainerDied","Data":"6639fe151bf786a80021c4b7a3c0eb1deeeb8de61b0d6e4beeb093c6ab627e1f"} Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.466345 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cvsph" event={"ID":"aab1c8ee-fd97-49f0-8376-67af38c710f3","Type":"ContainerDied","Data":"d8dcc024dd590af54cd63e0cc9852a22a944b8f97157c2c92ce5ef845357095e"} Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.466363 4711 scope.go:117] "RemoveContainer" containerID="6639fe151bf786a80021c4b7a3c0eb1deeeb8de61b0d6e4beeb093c6ab627e1f" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.466480 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cvsph" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.481425 4711 generic.go:334] "Generic (PLEG): container finished" podID="81590464-a00d-470b-86f6-8050913cf609" containerID="71b7db0ba0ed925298d54b5816cc72f3f57405454e4f25a2b49cad157ced4195" exitCode=0 Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.481513 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"81590464-a00d-470b-86f6-8050913cf609","Type":"ContainerDied","Data":"71b7db0ba0ed925298d54b5816cc72f3f57405454e4f25a2b49cad157ced4195"} Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.484588 4711 generic.go:334] "Generic (PLEG): container finished" podID="e22a45f5-3560-498f-8faa-e31d07aa4d48" containerID="05867f09a678af61853f73fa1db7466019970c4e55ba134a6a780f30fb150a82" exitCode=0 Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.484741 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"e22a45f5-3560-498f-8faa-e31d07aa4d48","Type":"ContainerDied","Data":"05867f09a678af61853f73fa1db7466019970c4e55ba134a6a780f30fb150a82"} Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.484807 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"e22a45f5-3560-498f-8faa-e31d07aa4d48","Type":"ContainerDied","Data":"f9621c3eaaa75c0fccc72a3cab3b5099742d580461218279b9942932f7277f35"} Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.484945 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.487100 4711 generic.go:334] "Generic (PLEG): container finished" podID="f962c90c-c114-413c-804e-999d1b936b65" containerID="857ef73900a457ab1fba1313522ab4ce0d259b2ac31b4d9ea1f5be86ef42b1ff" exitCode=0 Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.487846 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"f962c90c-c114-413c-804e-999d1b936b65","Type":"ContainerDied","Data":"857ef73900a457ab1fba1313522ab4ce0d259b2ac31b4d9ea1f5be86ef42b1ff"} Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.508587 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cvsph"] Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.514119 4711 scope.go:117] "RemoveContainer" containerID="3c4e0e9016099533759e13708e98f919f27232a2a03faaf4d30f8873eee58e8d" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.533020 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cvsph"] Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.542117 4711 scope.go:117] "RemoveContainer" containerID="dd907444622a8705a0c22d3e2afeb2bec86610efdb9d2a2316ad6b3d138cf751" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.542269 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.563071 4711 scope.go:117] "RemoveContainer" containerID="6639fe151bf786a80021c4b7a3c0eb1deeeb8de61b0d6e4beeb093c6ab627e1f" Dec 03 12:49:47 crc kubenswrapper[4711]: E1203 12:49:47.563569 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6639fe151bf786a80021c4b7a3c0eb1deeeb8de61b0d6e4beeb093c6ab627e1f\": container with ID starting with 6639fe151bf786a80021c4b7a3c0eb1deeeb8de61b0d6e4beeb093c6ab627e1f not found: ID does not exist" containerID="6639fe151bf786a80021c4b7a3c0eb1deeeb8de61b0d6e4beeb093c6ab627e1f" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.563599 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6639fe151bf786a80021c4b7a3c0eb1deeeb8de61b0d6e4beeb093c6ab627e1f"} err="failed to get container status \"6639fe151bf786a80021c4b7a3c0eb1deeeb8de61b0d6e4beeb093c6ab627e1f\": rpc error: code = NotFound desc = could not find container \"6639fe151bf786a80021c4b7a3c0eb1deeeb8de61b0d6e4beeb093c6ab627e1f\": container with ID starting with 6639fe151bf786a80021c4b7a3c0eb1deeeb8de61b0d6e4beeb093c6ab627e1f not found: ID does not exist" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.563619 4711 scope.go:117] "RemoveContainer" containerID="3c4e0e9016099533759e13708e98f919f27232a2a03faaf4d30f8873eee58e8d" Dec 03 12:49:47 crc kubenswrapper[4711]: E1203 12:49:47.563940 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c4e0e9016099533759e13708e98f919f27232a2a03faaf4d30f8873eee58e8d\": container with ID starting with 3c4e0e9016099533759e13708e98f919f27232a2a03faaf4d30f8873eee58e8d not found: ID does not exist" containerID="3c4e0e9016099533759e13708e98f919f27232a2a03faaf4d30f8873eee58e8d" Dec 03 12:49:47 crc 
kubenswrapper[4711]: I1203 12:49:47.563958 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c4e0e9016099533759e13708e98f919f27232a2a03faaf4d30f8873eee58e8d"} err="failed to get container status \"3c4e0e9016099533759e13708e98f919f27232a2a03faaf4d30f8873eee58e8d\": rpc error: code = NotFound desc = could not find container \"3c4e0e9016099533759e13708e98f919f27232a2a03faaf4d30f8873eee58e8d\": container with ID starting with 3c4e0e9016099533759e13708e98f919f27232a2a03faaf4d30f8873eee58e8d not found: ID does not exist" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.563971 4711 scope.go:117] "RemoveContainer" containerID="dd907444622a8705a0c22d3e2afeb2bec86610efdb9d2a2316ad6b3d138cf751" Dec 03 12:49:47 crc kubenswrapper[4711]: E1203 12:49:47.564489 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd907444622a8705a0c22d3e2afeb2bec86610efdb9d2a2316ad6b3d138cf751\": container with ID starting with dd907444622a8705a0c22d3e2afeb2bec86610efdb9d2a2316ad6b3d138cf751 not found: ID does not exist" containerID="dd907444622a8705a0c22d3e2afeb2bec86610efdb9d2a2316ad6b3d138cf751" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.564534 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd907444622a8705a0c22d3e2afeb2bec86610efdb9d2a2316ad6b3d138cf751"} err="failed to get container status \"dd907444622a8705a0c22d3e2afeb2bec86610efdb9d2a2316ad6b3d138cf751\": rpc error: code = NotFound desc = could not find container \"dd907444622a8705a0c22d3e2afeb2bec86610efdb9d2a2316ad6b3d138cf751\": container with ID starting with dd907444622a8705a0c22d3e2afeb2bec86610efdb9d2a2316ad6b3d138cf751 not found: ID does not exist" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.564562 4711 scope.go:117] "RemoveContainer" containerID="05867f09a678af61853f73fa1db7466019970c4e55ba134a6a780f30fb150a82" Dec 03 
12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.564878 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22a45f5-3560-498f-8faa-e31d07aa4d48-config-data\") pod \"e22a45f5-3560-498f-8faa-e31d07aa4d48\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.564947 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e22a45f5-3560-498f-8faa-e31d07aa4d48-logs\") pod \"e22a45f5-3560-498f-8faa-e31d07aa4d48\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.564988 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-sys\") pod \"e22a45f5-3560-498f-8faa-e31d07aa4d48\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.565042 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e22a45f5-3560-498f-8faa-e31d07aa4d48-scripts\") pod \"e22a45f5-3560-498f-8faa-e31d07aa4d48\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.565088 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-sys" (OuterVolumeSpecName: "sys") pod "e22a45f5-3560-498f-8faa-e31d07aa4d48" (UID: "e22a45f5-3560-498f-8faa-e31d07aa4d48"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.565144 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e22a45f5-3560-498f-8faa-e31d07aa4d48-httpd-run\") pod \"e22a45f5-3560-498f-8faa-e31d07aa4d48\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.565174 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-etc-nvme\") pod \"e22a45f5-3560-498f-8faa-e31d07aa4d48\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.565197 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-var-locks-brick\") pod \"e22a45f5-3560-498f-8faa-e31d07aa4d48\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.565231 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-dev\") pod \"e22a45f5-3560-498f-8faa-e31d07aa4d48\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.565254 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-run\") pod \"e22a45f5-3560-498f-8faa-e31d07aa4d48\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.565273 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-etc-iscsi\") pod \"e22a45f5-3560-498f-8faa-e31d07aa4d48\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.565338 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"e22a45f5-3560-498f-8faa-e31d07aa4d48\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.565370 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-dev" (OuterVolumeSpecName: "dev") pod "e22a45f5-3560-498f-8faa-e31d07aa4d48" (UID: "e22a45f5-3560-498f-8faa-e31d07aa4d48"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.565415 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-lib-modules\") pod \"e22a45f5-3560-498f-8faa-e31d07aa4d48\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.565438 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "e22a45f5-3560-498f-8faa-e31d07aa4d48" (UID: "e22a45f5-3560-498f-8faa-e31d07aa4d48"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.565445 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj9cl\" (UniqueName: \"kubernetes.io/projected/e22a45f5-3560-498f-8faa-e31d07aa4d48-kube-api-access-mj9cl\") pod \"e22a45f5-3560-498f-8faa-e31d07aa4d48\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.565465 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "e22a45f5-3560-498f-8faa-e31d07aa4d48" (UID: "e22a45f5-3560-498f-8faa-e31d07aa4d48"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.565466 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e22a45f5-3560-498f-8faa-e31d07aa4d48-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e22a45f5-3560-498f-8faa-e31d07aa4d48" (UID: "e22a45f5-3560-498f-8faa-e31d07aa4d48"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.565473 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"e22a45f5-3560-498f-8faa-e31d07aa4d48\" (UID: \"e22a45f5-3560-498f-8faa-e31d07aa4d48\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.566244 4711 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e22a45f5-3560-498f-8faa-e31d07aa4d48-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.566257 4711 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.566266 4711 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.566278 4711 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-dev\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.566286 4711 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-sys\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.567515 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-run" (OuterVolumeSpecName: "run") pod "e22a45f5-3560-498f-8faa-e31d07aa4d48" (UID: "e22a45f5-3560-498f-8faa-e31d07aa4d48"). 
InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.567580 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "e22a45f5-3560-498f-8faa-e31d07aa4d48" (UID: "e22a45f5-3560-498f-8faa-e31d07aa4d48"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.569528 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "e22a45f5-3560-498f-8faa-e31d07aa4d48" (UID: "e22a45f5-3560-498f-8faa-e31d07aa4d48"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.570025 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e22a45f5-3560-498f-8faa-e31d07aa4d48-logs" (OuterVolumeSpecName: "logs") pod "e22a45f5-3560-498f-8faa-e31d07aa4d48" (UID: "e22a45f5-3560-498f-8faa-e31d07aa4d48"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.576200 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage15-crc" (OuterVolumeSpecName: "glance-cache") pod "e22a45f5-3560-498f-8faa-e31d07aa4d48" (UID: "e22a45f5-3560-498f-8faa-e31d07aa4d48"). InnerVolumeSpecName "local-storage15-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.579313 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "e22a45f5-3560-498f-8faa-e31d07aa4d48" (UID: "e22a45f5-3560-498f-8faa-e31d07aa4d48"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.580296 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e22a45f5-3560-498f-8faa-e31d07aa4d48-kube-api-access-mj9cl" (OuterVolumeSpecName: "kube-api-access-mj9cl") pod "e22a45f5-3560-498f-8faa-e31d07aa4d48" (UID: "e22a45f5-3560-498f-8faa-e31d07aa4d48"). InnerVolumeSpecName "kube-api-access-mj9cl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.584412 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e22a45f5-3560-498f-8faa-e31d07aa4d48-scripts" (OuterVolumeSpecName: "scripts") pod "e22a45f5-3560-498f-8faa-e31d07aa4d48" (UID: "e22a45f5-3560-498f-8faa-e31d07aa4d48"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.613579 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e22a45f5-3560-498f-8faa-e31d07aa4d48-config-data" (OuterVolumeSpecName: "config-data") pod "e22a45f5-3560-498f-8faa-e31d07aa4d48" (UID: "e22a45f5-3560-498f-8faa-e31d07aa4d48"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.666776 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-etc-iscsi\") pod \"81590464-a00d-470b-86f6-8050913cf609\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.666822 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-sys\") pod \"81590464-a00d-470b-86f6-8050913cf609\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.666866 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncf9h\" (UniqueName: \"kubernetes.io/projected/81590464-a00d-470b-86f6-8050913cf609-kube-api-access-ncf9h\") pod \"81590464-a00d-470b-86f6-8050913cf609\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.666889 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81590464-a00d-470b-86f6-8050913cf609-config-data\") pod \"81590464-a00d-470b-86f6-8050913cf609\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.666919 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81590464-a00d-470b-86f6-8050913cf609-scripts\") pod \"81590464-a00d-470b-86f6-8050913cf609\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.666967 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-etc-nvme\") pod \"81590464-a00d-470b-86f6-8050913cf609\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.666983 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-run\") pod \"81590464-a00d-470b-86f6-8050913cf609\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.667029 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-lib-modules\") pod \"81590464-a00d-470b-86f6-8050913cf609\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.667046 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"81590464-a00d-470b-86f6-8050913cf609\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.667095 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-var-locks-brick\") pod \"81590464-a00d-470b-86f6-8050913cf609\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.667132 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"81590464-a00d-470b-86f6-8050913cf609\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.667148 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" 
(UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-dev\") pod \"81590464-a00d-470b-86f6-8050913cf609\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.667201 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81590464-a00d-470b-86f6-8050913cf609-httpd-run\") pod \"81590464-a00d-470b-86f6-8050913cf609\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.667201 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "81590464-a00d-470b-86f6-8050913cf609" (UID: "81590464-a00d-470b-86f6-8050913cf609"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.667227 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81590464-a00d-470b-86f6-8050913cf609-logs\") pod \"81590464-a00d-470b-86f6-8050913cf609\" (UID: \"81590464-a00d-470b-86f6-8050913cf609\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.667247 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-sys" (OuterVolumeSpecName: "sys") pod "81590464-a00d-470b-86f6-8050913cf609" (UID: "81590464-a00d-470b-86f6-8050913cf609"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.667502 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-run" (OuterVolumeSpecName: "run") pod "81590464-a00d-470b-86f6-8050913cf609" (UID: "81590464-a00d-470b-86f6-8050913cf609"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.667557 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "81590464-a00d-470b-86f6-8050913cf609" (UID: "81590464-a00d-470b-86f6-8050913cf609"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.667602 4711 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.667618 4711 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-sys\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.667628 4711 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.667637 4711 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.667637 4711 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "81590464-a00d-470b-86f6-8050913cf609" (UID: "81590464-a00d-470b-86f6-8050913cf609"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.667646 4711 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.667686 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-dev" (OuterVolumeSpecName: "dev") pod "81590464-a00d-470b-86f6-8050913cf609" (UID: "81590464-a00d-470b-86f6-8050913cf609"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.667710 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.667731 4711 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e22a45f5-3560-498f-8faa-e31d07aa4d48-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.667749 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mj9cl\" (UniqueName: \"kubernetes.io/projected/e22a45f5-3560-498f-8faa-e31d07aa4d48-kube-api-access-mj9cl\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.667771 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage15-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.667769 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81590464-a00d-470b-86f6-8050913cf609-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "81590464-a00d-470b-86f6-8050913cf609" (UID: "81590464-a00d-470b-86f6-8050913cf609"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.667784 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22a45f5-3560-498f-8faa-e31d07aa4d48-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.667766 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "81590464-a00d-470b-86f6-8050913cf609" (UID: "81590464-a00d-470b-86f6-8050913cf609"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.667798 4711 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e22a45f5-3560-498f-8faa-e31d07aa4d48-logs\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.667854 4711 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e22a45f5-3560-498f-8faa-e31d07aa4d48-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.667975 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81590464-a00d-470b-86f6-8050913cf609-logs" (OuterVolumeSpecName: "logs") pod "81590464-a00d-470b-86f6-8050913cf609" (UID: "81590464-a00d-470b-86f6-8050913cf609"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.670870 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81590464-a00d-470b-86f6-8050913cf609-kube-api-access-ncf9h" (OuterVolumeSpecName: "kube-api-access-ncf9h") pod "81590464-a00d-470b-86f6-8050913cf609" (UID: "81590464-a00d-470b-86f6-8050913cf609"). InnerVolumeSpecName "kube-api-access-ncf9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.671669 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "81590464-a00d-470b-86f6-8050913cf609" (UID: "81590464-a00d-470b-86f6-8050913cf609"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.672117 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81590464-a00d-470b-86f6-8050913cf609-scripts" (OuterVolumeSpecName: "scripts") pod "81590464-a00d-470b-86f6-8050913cf609" (UID: "81590464-a00d-470b-86f6-8050913cf609"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.672861 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.674682 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance-cache") pod "81590464-a00d-470b-86f6-8050913cf609" (UID: "81590464-a00d-470b-86f6-8050913cf609"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.684190 4711 scope.go:117] "RemoveContainer" containerID="6ca23a7e884f5481496af68715e2bf40633787a48f2fac2cc0680d9dabfab15e" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.685039 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.695999 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage15-crc" (UniqueName: "kubernetes.io/local-volume/local-storage15-crc") on node "crc" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.714799 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81590464-a00d-470b-86f6-8050913cf609-config-data" (OuterVolumeSpecName: "config-data") pod "81590464-a00d-470b-86f6-8050913cf609" (UID: "81590464-a00d-470b-86f6-8050913cf609"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.769024 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-lib-modules\") pod \"f962c90c-c114-413c-804e-999d1b936b65\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.769366 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f962c90c-c114-413c-804e-999d1b936b65-logs\") pod \"f962c90c-c114-413c-804e-999d1b936b65\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.769154 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "f962c90c-c114-413c-804e-999d1b936b65" (UID: "f962c90c-c114-413c-804e-999d1b936b65"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.769429 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"f962c90c-c114-413c-804e-999d1b936b65\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.769475 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-etc-iscsi\") pod \"f962c90c-c114-413c-804e-999d1b936b65\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.769503 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-var-locks-brick\") pod \"f962c90c-c114-413c-804e-999d1b936b65\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.769525 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-run\") pod \"f962c90c-c114-413c-804e-999d1b936b65\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.769589 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f962c90c-c114-413c-804e-999d1b936b65-scripts\") pod \"f962c90c-c114-413c-804e-999d1b936b65\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.769623 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod 
"f962c90c-c114-413c-804e-999d1b936b65" (UID: "f962c90c-c114-413c-804e-999d1b936b65"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.769643 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-run" (OuterVolumeSpecName: "run") pod "f962c90c-c114-413c-804e-999d1b936b65" (UID: "f962c90c-c114-413c-804e-999d1b936b65"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.769654 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-dev\") pod \"f962c90c-c114-413c-804e-999d1b936b65\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.769698 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "f962c90c-c114-413c-804e-999d1b936b65" (UID: "f962c90c-c114-413c-804e-999d1b936b65"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.769703 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"f962c90c-c114-413c-804e-999d1b936b65\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.769738 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-dev" (OuterVolumeSpecName: "dev") pod "f962c90c-c114-413c-804e-999d1b936b65" (UID: "f962c90c-c114-413c-804e-999d1b936b65"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.769761 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f962c90c-c114-413c-804e-999d1b936b65-logs" (OuterVolumeSpecName: "logs") pod "f962c90c-c114-413c-804e-999d1b936b65" (UID: "f962c90c-c114-413c-804e-999d1b936b65"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.769780 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbgg9\" (UniqueName: \"kubernetes.io/projected/f962c90c-c114-413c-804e-999d1b936b65-kube-api-access-gbgg9\") pod \"f962c90c-c114-413c-804e-999d1b936b65\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.769813 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f962c90c-c114-413c-804e-999d1b936b65-config-data\") pod \"f962c90c-c114-413c-804e-999d1b936b65\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.769848 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-etc-nvme\") pod \"f962c90c-c114-413c-804e-999d1b936b65\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.769864 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-sys\") pod \"f962c90c-c114-413c-804e-999d1b936b65\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.769893 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f962c90c-c114-413c-804e-999d1b936b65-httpd-run\") pod \"f962c90c-c114-413c-804e-999d1b936b65\" (UID: \"f962c90c-c114-413c-804e-999d1b936b65\") " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.770153 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-etc-nvme" 
(OuterVolumeSpecName: "etc-nvme") pod "f962c90c-c114-413c-804e-999d1b936b65" (UID: "f962c90c-c114-413c-804e-999d1b936b65"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.770405 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f962c90c-c114-413c-804e-999d1b936b65-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f962c90c-c114-413c-804e-999d1b936b65" (UID: "f962c90c-c114-413c-804e-999d1b936b65"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.770438 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-sys" (OuterVolumeSpecName: "sys") pod "f962c90c-c114-413c-804e-999d1b936b65" (UID: "f962c90c-c114-413c-804e-999d1b936b65"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.770612 4711 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.770638 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.770649 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.770657 4711 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.770667 4711 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-sys\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.770675 4711 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f962c90c-c114-413c-804e-999d1b936b65-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.770683 4711 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.770694 4711 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.770701 4711 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f962c90c-c114-413c-804e-999d1b936b65-logs\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.770709 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.770722 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.770730 4711 reconciler_common.go:293] "Volume detached 
for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-dev\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.770739 4711 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.770747 4711 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.770755 4711 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.770762 4711 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81590464-a00d-470b-86f6-8050913cf609-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.770771 4711 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81590464-a00d-470b-86f6-8050913cf609-logs\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.770778 4711 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f962c90c-c114-413c-804e-999d1b936b65-dev\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.770787 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncf9h\" (UniqueName: \"kubernetes.io/projected/81590464-a00d-470b-86f6-8050913cf609-kube-api-access-ncf9h\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 
12:49:47.770795 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81590464-a00d-470b-86f6-8050913cf609-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.770803 4711 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81590464-a00d-470b-86f6-8050913cf609-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.770811 4711 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/81590464-a00d-470b-86f6-8050913cf609-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.773615 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance-cache") pod "f962c90c-c114-413c-804e-999d1b936b65" (UID: "f962c90c-c114-413c-804e-999d1b936b65"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.773660 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f962c90c-c114-413c-804e-999d1b936b65-kube-api-access-gbgg9" (OuterVolumeSpecName: "kube-api-access-gbgg9") pod "f962c90c-c114-413c-804e-999d1b936b65" (UID: "f962c90c-c114-413c-804e-999d1b936b65"). InnerVolumeSpecName "kube-api-access-gbgg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.773645 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage17-crc" (OuterVolumeSpecName: "glance") pod "f962c90c-c114-413c-804e-999d1b936b65" (UID: "f962c90c-c114-413c-804e-999d1b936b65"). InnerVolumeSpecName "local-storage17-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.775711 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f962c90c-c114-413c-804e-999d1b936b65-scripts" (OuterVolumeSpecName: "scripts") pod "f962c90c-c114-413c-804e-999d1b936b65" (UID: "f962c90c-c114-413c-804e-999d1b936b65"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.783400 4711 scope.go:117] "RemoveContainer" containerID="05867f09a678af61853f73fa1db7466019970c4e55ba134a6a780f30fb150a82" Dec 03 12:49:47 crc kubenswrapper[4711]: E1203 12:49:47.783885 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05867f09a678af61853f73fa1db7466019970c4e55ba134a6a780f30fb150a82\": container with ID starting with 05867f09a678af61853f73fa1db7466019970c4e55ba134a6a780f30fb150a82 not found: ID does not exist" containerID="05867f09a678af61853f73fa1db7466019970c4e55ba134a6a780f30fb150a82" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.784155 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05867f09a678af61853f73fa1db7466019970c4e55ba134a6a780f30fb150a82"} err="failed to get container status \"05867f09a678af61853f73fa1db7466019970c4e55ba134a6a780f30fb150a82\": rpc error: code = NotFound desc = could not find container \"05867f09a678af61853f73fa1db7466019970c4e55ba134a6a780f30fb150a82\": container with ID starting with 05867f09a678af61853f73fa1db7466019970c4e55ba134a6a780f30fb150a82 not found: ID does not exist" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.784189 4711 scope.go:117] "RemoveContainer" containerID="6ca23a7e884f5481496af68715e2bf40633787a48f2fac2cc0680d9dabfab15e" Dec 03 12:49:47 crc kubenswrapper[4711]: E1203 12:49:47.784595 4711 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"6ca23a7e884f5481496af68715e2bf40633787a48f2fac2cc0680d9dabfab15e\": container with ID starting with 6ca23a7e884f5481496af68715e2bf40633787a48f2fac2cc0680d9dabfab15e not found: ID does not exist" containerID="6ca23a7e884f5481496af68715e2bf40633787a48f2fac2cc0680d9dabfab15e" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.784624 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ca23a7e884f5481496af68715e2bf40633787a48f2fac2cc0680d9dabfab15e"} err="failed to get container status \"6ca23a7e884f5481496af68715e2bf40633787a48f2fac2cc0680d9dabfab15e\": rpc error: code = NotFound desc = could not find container \"6ca23a7e884f5481496af68715e2bf40633787a48f2fac2cc0680d9dabfab15e\": container with ID starting with 6ca23a7e884f5481496af68715e2bf40633787a48f2fac2cc0680d9dabfab15e not found: ID does not exist" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.787936 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.800313 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.831727 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aab1c8ee-fd97-49f0-8376-67af38c710f3" path="/var/lib/kubelet/pods/aab1c8ee-fd97-49f0-8376-67af38c710f3/volumes" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.833239 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f962c90c-c114-413c-804e-999d1b936b65-config-data" (OuterVolumeSpecName: "config-data") pod "f962c90c-c114-413c-804e-999d1b936b65" (UID: "f962c90c-c114-413c-804e-999d1b936b65"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.836239 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.838666 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.872693 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.872729 4711 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f962c90c-c114-413c-804e-999d1b936b65-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.872746 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.872760 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbgg9\" (UniqueName: \"kubernetes.io/projected/f962c90c-c114-413c-804e-999d1b936b65-kube-api-access-gbgg9\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.872773 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.872784 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f962c90c-c114-413c-804e-999d1b936b65-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc 
kubenswrapper[4711]: I1203 12:49:47.872796 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.886920 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage17-crc" (UniqueName: "kubernetes.io/local-volume/local-storage17-crc") on node "crc" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.888117 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.979144 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:47 crc kubenswrapper[4711]: I1203 12:49:47.979192 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.049477 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.182640 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-logs\") pod \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.182729 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-scripts\") pod \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.182774 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-dev\") pod \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.182832 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-var-locks-brick\") pod \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.182868 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-httpd-run\") pod \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.182967 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c92f\" (UniqueName: 
\"kubernetes.io/projected/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-kube-api-access-8c92f\") pod \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.183041 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.183092 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-logs" (OuterVolumeSpecName: "logs") pod "1c1ad65c-d3ad-499e-a794-0d482cbe7df9" (UID: "1c1ad65c-d3ad-499e-a794-0d482cbe7df9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.183106 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-config-data\") pod \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.183210 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-etc-iscsi\") pod \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.183272 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-etc-nvme\") pod \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 
12:49:48.183328 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-lib-modules\") pod \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.183347 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.183383 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-sys\") pod \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.183404 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-run\") pod \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\" (UID: \"1c1ad65c-d3ad-499e-a794-0d482cbe7df9\") " Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.183561 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "1c1ad65c-d3ad-499e-a794-0d482cbe7df9" (UID: "1c1ad65c-d3ad-499e-a794-0d482cbe7df9"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.183632 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-run" (OuterVolumeSpecName: "run") pod "1c1ad65c-d3ad-499e-a794-0d482cbe7df9" (UID: "1c1ad65c-d3ad-499e-a794-0d482cbe7df9"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.183656 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "1c1ad65c-d3ad-499e-a794-0d482cbe7df9" (UID: "1c1ad65c-d3ad-499e-a794-0d482cbe7df9"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.183730 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "1c1ad65c-d3ad-499e-a794-0d482cbe7df9" (UID: "1c1ad65c-d3ad-499e-a794-0d482cbe7df9"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.183722 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "1c1ad65c-d3ad-499e-a794-0d482cbe7df9" (UID: "1c1ad65c-d3ad-499e-a794-0d482cbe7df9"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.184131 4711 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.184150 4711 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.184159 4711 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.184169 4711 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.184177 4711 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-logs\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.184186 4711 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.184215 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-sys" (OuterVolumeSpecName: "sys") pod "1c1ad65c-d3ad-499e-a794-0d482cbe7df9" (UID: "1c1ad65c-d3ad-499e-a794-0d482cbe7df9"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.184370 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1c1ad65c-d3ad-499e-a794-0d482cbe7df9" (UID: "1c1ad65c-d3ad-499e-a794-0d482cbe7df9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.184396 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-dev" (OuterVolumeSpecName: "dev") pod "1c1ad65c-d3ad-499e-a794-0d482cbe7df9" (UID: "1c1ad65c-d3ad-499e-a794-0d482cbe7df9"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.186498 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-scripts" (OuterVolumeSpecName: "scripts") pod "1c1ad65c-d3ad-499e-a794-0d482cbe7df9" (UID: "1c1ad65c-d3ad-499e-a794-0d482cbe7df9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.186979 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-kube-api-access-8c92f" (OuterVolumeSpecName: "kube-api-access-8c92f") pod "1c1ad65c-d3ad-499e-a794-0d482cbe7df9" (UID: "1c1ad65c-d3ad-499e-a794-0d482cbe7df9"). InnerVolumeSpecName "kube-api-access-8c92f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.188685 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "glance-cache") pod "1c1ad65c-d3ad-499e-a794-0d482cbe7df9" (UID: "1c1ad65c-d3ad-499e-a794-0d482cbe7df9"). InnerVolumeSpecName "local-storage20-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.188765 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage16-crc" (OuterVolumeSpecName: "glance") pod "1c1ad65c-d3ad-499e-a794-0d482cbe7df9" (UID: "1c1ad65c-d3ad-499e-a794-0d482cbe7df9"). InnerVolumeSpecName "local-storage16-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.231464 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-config-data" (OuterVolumeSpecName: "config-data") pod "1c1ad65c-d3ad-499e-a794-0d482cbe7df9" (UID: "1c1ad65c-d3ad-499e-a794-0d482cbe7df9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.285458 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.285510 4711 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-sys\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.285531 4711 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.285547 4711 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-dev\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.285564 4711 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.285581 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8c92f\" (UniqueName: \"kubernetes.io/projected/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-kube-api-access-8c92f\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.285619 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" " Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.285636 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1c1ad65c-d3ad-499e-a794-0d482cbe7df9-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.301055 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.319189 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage16-crc" (UniqueName: "kubernetes.io/local-volume/local-storage16-crc") on node "crc" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.387549 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.387585 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.496042 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"f962c90c-c114-413c-804e-999d1b936b65","Type":"ContainerDied","Data":"1f0467aa6a2211d8f905febcced419c5edeb852ca05efc8a4e6232da0e0ea7b6"} Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.496102 4711 scope.go:117] "RemoveContainer" containerID="857ef73900a457ab1fba1313522ab4ce0d259b2ac31b4d9ea1f5be86ef42b1ff" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.496113 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.498847 4711 generic.go:334] "Generic (PLEG): container finished" podID="1c1ad65c-d3ad-499e-a794-0d482cbe7df9" containerID="33f43703e13b13804a5ff21a839fb6313a2e4ce976695abca2754529ff92197c" exitCode=0 Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.498922 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.499032 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"1c1ad65c-d3ad-499e-a794-0d482cbe7df9","Type":"ContainerDied","Data":"33f43703e13b13804a5ff21a839fb6313a2e4ce976695abca2754529ff92197c"} Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.499123 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"1c1ad65c-d3ad-499e-a794-0d482cbe7df9","Type":"ContainerDied","Data":"66921293016b419659e6dc70c6cbae30f53c58a2684e560791311950834290a0"} Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.501330 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"81590464-a00d-470b-86f6-8050913cf609","Type":"ContainerDied","Data":"ac695d207b75fa0940061679c874e22a459034d3b8a7bbfae8ac0f921e91f331"} Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.501649 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.528415 4711 scope.go:117] "RemoveContainer" containerID="c24c104790aa2954664ca3aedaefe267a8358a3abd39a0c1080aac720717cbaf" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.532498 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.545398 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.566715 4711 scope.go:117] "RemoveContainer" containerID="33f43703e13b13804a5ff21a839fb6313a2e4ce976695abca2754529ff92197c" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.567384 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.573447 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.580043 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.586670 4711 scope.go:117] "RemoveContainer" containerID="4bd7ed5019f716c1ab5d65b59703a65b9d243e47ccf333493586dd4c545da706" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.587004 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.628347 4711 scope.go:117] "RemoveContainer" containerID="33f43703e13b13804a5ff21a839fb6313a2e4ce976695abca2754529ff92197c" Dec 03 12:49:48 crc kubenswrapper[4711]: E1203 12:49:48.628791 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"33f43703e13b13804a5ff21a839fb6313a2e4ce976695abca2754529ff92197c\": container with ID starting with 33f43703e13b13804a5ff21a839fb6313a2e4ce976695abca2754529ff92197c not found: ID does not exist" containerID="33f43703e13b13804a5ff21a839fb6313a2e4ce976695abca2754529ff92197c" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.628830 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33f43703e13b13804a5ff21a839fb6313a2e4ce976695abca2754529ff92197c"} err="failed to get container status \"33f43703e13b13804a5ff21a839fb6313a2e4ce976695abca2754529ff92197c\": rpc error: code = NotFound desc = could not find container \"33f43703e13b13804a5ff21a839fb6313a2e4ce976695abca2754529ff92197c\": container with ID starting with 33f43703e13b13804a5ff21a839fb6313a2e4ce976695abca2754529ff92197c not found: ID does not exist" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.628855 4711 scope.go:117] "RemoveContainer" containerID="4bd7ed5019f716c1ab5d65b59703a65b9d243e47ccf333493586dd4c545da706" Dec 03 12:49:48 crc kubenswrapper[4711]: E1203 12:49:48.629243 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bd7ed5019f716c1ab5d65b59703a65b9d243e47ccf333493586dd4c545da706\": container with ID starting with 4bd7ed5019f716c1ab5d65b59703a65b9d243e47ccf333493586dd4c545da706 not found: ID does not exist" containerID="4bd7ed5019f716c1ab5d65b59703a65b9d243e47ccf333493586dd4c545da706" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.629270 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bd7ed5019f716c1ab5d65b59703a65b9d243e47ccf333493586dd4c545da706"} err="failed to get container status \"4bd7ed5019f716c1ab5d65b59703a65b9d243e47ccf333493586dd4c545da706\": rpc error: code = NotFound desc = could not find container 
\"4bd7ed5019f716c1ab5d65b59703a65b9d243e47ccf333493586dd4c545da706\": container with ID starting with 4bd7ed5019f716c1ab5d65b59703a65b9d243e47ccf333493586dd4c545da706 not found: ID does not exist" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.629288 4711 scope.go:117] "RemoveContainer" containerID="71b7db0ba0ed925298d54b5816cc72f3f57405454e4f25a2b49cad157ced4195" Dec 03 12:49:48 crc kubenswrapper[4711]: I1203 12:49:48.654353 4711 scope.go:117] "RemoveContainer" containerID="03d7f7bc9ade46a3241c9d6dfa6fc3c01460d82e553458a970076d7715bbefe7" Dec 03 12:49:49 crc kubenswrapper[4711]: I1203 12:49:49.104156 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nvcfc" Dec 03 12:49:49 crc kubenswrapper[4711]: I1203 12:49:49.104251 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nvcfc" Dec 03 12:49:49 crc kubenswrapper[4711]: I1203 12:49:49.165642 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nvcfc" Dec 03 12:49:49 crc kubenswrapper[4711]: I1203 12:49:49.558306 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 03 12:49:49 crc kubenswrapper[4711]: I1203 12:49:49.558768 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="ae7e3c8c-8b3d-46de-8359-c33d175209e7" containerName="glance-log" containerID="cri-o://9fe93a020d7261971c43f64fc162e597174a7c5f840ebbce678b618beff6ea09" gracePeriod=30 Dec 03 12:49:49 crc kubenswrapper[4711]: I1203 12:49:49.559003 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="ae7e3c8c-8b3d-46de-8359-c33d175209e7" containerName="glance-httpd" 
containerID="cri-o://6c4faba728e6f7de002f6e67aadf9a6ce3ca606a520131d2f30be162e184b7aa" gracePeriod=30 Dec 03 12:49:49 crc kubenswrapper[4711]: I1203 12:49:49.596658 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nvcfc" Dec 03 12:49:49 crc kubenswrapper[4711]: I1203 12:49:49.825431 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c1ad65c-d3ad-499e-a794-0d482cbe7df9" path="/var/lib/kubelet/pods/1c1ad65c-d3ad-499e-a794-0d482cbe7df9/volumes" Dec 03 12:49:49 crc kubenswrapper[4711]: I1203 12:49:49.826285 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81590464-a00d-470b-86f6-8050913cf609" path="/var/lib/kubelet/pods/81590464-a00d-470b-86f6-8050913cf609/volumes" Dec 03 12:49:49 crc kubenswrapper[4711]: I1203 12:49:49.827044 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e22a45f5-3560-498f-8faa-e31d07aa4d48" path="/var/lib/kubelet/pods/e22a45f5-3560-498f-8faa-e31d07aa4d48/volumes" Dec 03 12:49:49 crc kubenswrapper[4711]: I1203 12:49:49.828342 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f962c90c-c114-413c-804e-999d1b936b65" path="/var/lib/kubelet/pods/f962c90c-c114-413c-804e-999d1b936b65/volumes" Dec 03 12:49:50 crc kubenswrapper[4711]: I1203 12:49:50.046739 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:49:50 crc kubenswrapper[4711]: I1203 12:49:50.047056 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="4bfce555-bbde-4b58-9fcc-13b1e2db8a3c" containerName="glance-log" containerID="cri-o://307aeba66bd25e6f7c2f58ca07a49899392bd19e5a59f975825341a383b4b8d7" gracePeriod=30 Dec 03 12:49:50 crc kubenswrapper[4711]: I1203 12:49:50.047163 4711 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="4bfce555-bbde-4b58-9fcc-13b1e2db8a3c" containerName="glance-httpd" containerID="cri-o://bd9f23931c906d663b7cc8407e6555b6738ae66eaeff0ecb1cf7a82121351cba" gracePeriod=30 Dec 03 12:49:50 crc kubenswrapper[4711]: I1203 12:49:50.536132 4711 generic.go:334] "Generic (PLEG): container finished" podID="ae7e3c8c-8b3d-46de-8359-c33d175209e7" containerID="9fe93a020d7261971c43f64fc162e597174a7c5f840ebbce678b618beff6ea09" exitCode=143 Dec 03 12:49:50 crc kubenswrapper[4711]: I1203 12:49:50.536302 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"ae7e3c8c-8b3d-46de-8359-c33d175209e7","Type":"ContainerDied","Data":"9fe93a020d7261971c43f64fc162e597174a7c5f840ebbce678b618beff6ea09"} Dec 03 12:49:50 crc kubenswrapper[4711]: I1203 12:49:50.539055 4711 generic.go:334] "Generic (PLEG): container finished" podID="4bfce555-bbde-4b58-9fcc-13b1e2db8a3c" containerID="307aeba66bd25e6f7c2f58ca07a49899392bd19e5a59f975825341a383b4b8d7" exitCode=143 Dec 03 12:49:50 crc kubenswrapper[4711]: I1203 12:49:50.539246 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c","Type":"ContainerDied","Data":"307aeba66bd25e6f7c2f58ca07a49899392bd19e5a59f975825341a383b4b8d7"} Dec 03 12:49:51 crc kubenswrapper[4711]: I1203 12:49:51.266603 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nvcfc"] Dec 03 12:49:51 crc kubenswrapper[4711]: I1203 12:49:51.394528 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zrhrt" Dec 03 12:49:51 crc kubenswrapper[4711]: I1203 12:49:51.394604 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zrhrt" Dec 03 12:49:51 crc kubenswrapper[4711]: I1203 
12:49:51.444167 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zrhrt" Dec 03 12:49:51 crc kubenswrapper[4711]: I1203 12:49:51.547312 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nvcfc" podUID="ff7b9727-150f-4ee5-b22b-3147877bdaa6" containerName="registry-server" containerID="cri-o://b59f4aaaa98e0ec9d11cb1211ad6f210cf0a6b462ff2f6f6882add408328f6ca" gracePeriod=2 Dec 03 12:49:51 crc kubenswrapper[4711]: I1203 12:49:51.592701 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zrhrt" Dec 03 12:49:51 crc kubenswrapper[4711]: I1203 12:49:51.904386 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nvcfc" Dec 03 12:49:51 crc kubenswrapper[4711]: I1203 12:49:51.965572 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff7b9727-150f-4ee5-b22b-3147877bdaa6-utilities\") pod \"ff7b9727-150f-4ee5-b22b-3147877bdaa6\" (UID: \"ff7b9727-150f-4ee5-b22b-3147877bdaa6\") " Dec 03 12:49:51 crc kubenswrapper[4711]: I1203 12:49:51.965649 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff7b9727-150f-4ee5-b22b-3147877bdaa6-catalog-content\") pod \"ff7b9727-150f-4ee5-b22b-3147877bdaa6\" (UID: \"ff7b9727-150f-4ee5-b22b-3147877bdaa6\") " Dec 03 12:49:51 crc kubenswrapper[4711]: I1203 12:49:51.965765 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6f5c9\" (UniqueName: \"kubernetes.io/projected/ff7b9727-150f-4ee5-b22b-3147877bdaa6-kube-api-access-6f5c9\") pod \"ff7b9727-150f-4ee5-b22b-3147877bdaa6\" (UID: \"ff7b9727-150f-4ee5-b22b-3147877bdaa6\") " Dec 03 12:49:51 crc 
kubenswrapper[4711]: I1203 12:49:51.967760 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff7b9727-150f-4ee5-b22b-3147877bdaa6-utilities" (OuterVolumeSpecName: "utilities") pod "ff7b9727-150f-4ee5-b22b-3147877bdaa6" (UID: "ff7b9727-150f-4ee5-b22b-3147877bdaa6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:49:51 crc kubenswrapper[4711]: I1203 12:49:51.974724 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff7b9727-150f-4ee5-b22b-3147877bdaa6-kube-api-access-6f5c9" (OuterVolumeSpecName: "kube-api-access-6f5c9") pod "ff7b9727-150f-4ee5-b22b-3147877bdaa6" (UID: "ff7b9727-150f-4ee5-b22b-3147877bdaa6"). InnerVolumeSpecName "kube-api-access-6f5c9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:49:52 crc kubenswrapper[4711]: I1203 12:49:52.015609 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff7b9727-150f-4ee5-b22b-3147877bdaa6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff7b9727-150f-4ee5-b22b-3147877bdaa6" (UID: "ff7b9727-150f-4ee5-b22b-3147877bdaa6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:49:52 crc kubenswrapper[4711]: I1203 12:49:52.067090 4711 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff7b9727-150f-4ee5-b22b-3147877bdaa6-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 12:49:52 crc kubenswrapper[4711]: I1203 12:49:52.067118 4711 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff7b9727-150f-4ee5-b22b-3147877bdaa6-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 12:49:52 crc kubenswrapper[4711]: I1203 12:49:52.067129 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6f5c9\" (UniqueName: \"kubernetes.io/projected/ff7b9727-150f-4ee5-b22b-3147877bdaa6-kube-api-access-6f5c9\") on node \"crc\" DevicePath \"\""
Dec 03 12:49:52 crc kubenswrapper[4711]: I1203 12:49:52.561412 4711 generic.go:334] "Generic (PLEG): container finished" podID="ff7b9727-150f-4ee5-b22b-3147877bdaa6" containerID="b59f4aaaa98e0ec9d11cb1211ad6f210cf0a6b462ff2f6f6882add408328f6ca" exitCode=0
Dec 03 12:49:52 crc kubenswrapper[4711]: I1203 12:49:52.561475 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvcfc" event={"ID":"ff7b9727-150f-4ee5-b22b-3147877bdaa6","Type":"ContainerDied","Data":"b59f4aaaa98e0ec9d11cb1211ad6f210cf0a6b462ff2f6f6882add408328f6ca"}
Dec 03 12:49:52 crc kubenswrapper[4711]: I1203 12:49:52.561523 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvcfc" event={"ID":"ff7b9727-150f-4ee5-b22b-3147877bdaa6","Type":"ContainerDied","Data":"b893e089a750a3a0b9d8fbabd8aa59b4e0e43e30bc8d0d3ae2813ba74b9420c6"}
Dec 03 12:49:52 crc kubenswrapper[4711]: I1203 12:49:52.561545 4711 scope.go:117] "RemoveContainer" containerID="b59f4aaaa98e0ec9d11cb1211ad6f210cf0a6b462ff2f6f6882add408328f6ca"
Dec 03 12:49:52 crc kubenswrapper[4711]: I1203 12:49:52.562551 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nvcfc"
Dec 03 12:49:52 crc kubenswrapper[4711]: I1203 12:49:52.582033 4711 scope.go:117] "RemoveContainer" containerID="5c60d03e10a0a2b04ed188bef5ea3ef704d6137b4291e6365947fa739b50f745"
Dec 03 12:49:52 crc kubenswrapper[4711]: I1203 12:49:52.609651 4711 scope.go:117] "RemoveContainer" containerID="0d590ac250f716d313dcc2d5ac96a3b788cd92009296bc442d3ae8f6ad7bf683"
Dec 03 12:49:52 crc kubenswrapper[4711]: I1203 12:49:52.623985 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nvcfc"]
Dec 03 12:49:52 crc kubenswrapper[4711]: I1203 12:49:52.637486 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nvcfc"]
Dec 03 12:49:52 crc kubenswrapper[4711]: I1203 12:49:52.662885 4711 scope.go:117] "RemoveContainer" containerID="b59f4aaaa98e0ec9d11cb1211ad6f210cf0a6b462ff2f6f6882add408328f6ca"
Dec 03 12:49:52 crc kubenswrapper[4711]: E1203 12:49:52.667093 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b59f4aaaa98e0ec9d11cb1211ad6f210cf0a6b462ff2f6f6882add408328f6ca\": container with ID starting with b59f4aaaa98e0ec9d11cb1211ad6f210cf0a6b462ff2f6f6882add408328f6ca not found: ID does not exist" containerID="b59f4aaaa98e0ec9d11cb1211ad6f210cf0a6b462ff2f6f6882add408328f6ca"
Dec 03 12:49:52 crc kubenswrapper[4711]: I1203 12:49:52.667341 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b59f4aaaa98e0ec9d11cb1211ad6f210cf0a6b462ff2f6f6882add408328f6ca"} err="failed to get container status \"b59f4aaaa98e0ec9d11cb1211ad6f210cf0a6b462ff2f6f6882add408328f6ca\": rpc error: code = NotFound desc = could not find container \"b59f4aaaa98e0ec9d11cb1211ad6f210cf0a6b462ff2f6f6882add408328f6ca\": container with ID starting with b59f4aaaa98e0ec9d11cb1211ad6f210cf0a6b462ff2f6f6882add408328f6ca not found: ID does not exist"
Dec 03 12:49:52 crc kubenswrapper[4711]: I1203 12:49:52.667598 4711 scope.go:117] "RemoveContainer" containerID="5c60d03e10a0a2b04ed188bef5ea3ef704d6137b4291e6365947fa739b50f745"
Dec 03 12:49:52 crc kubenswrapper[4711]: E1203 12:49:52.668622 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c60d03e10a0a2b04ed188bef5ea3ef704d6137b4291e6365947fa739b50f745\": container with ID starting with 5c60d03e10a0a2b04ed188bef5ea3ef704d6137b4291e6365947fa739b50f745 not found: ID does not exist" containerID="5c60d03e10a0a2b04ed188bef5ea3ef704d6137b4291e6365947fa739b50f745"
Dec 03 12:49:52 crc kubenswrapper[4711]: I1203 12:49:52.668690 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c60d03e10a0a2b04ed188bef5ea3ef704d6137b4291e6365947fa739b50f745"} err="failed to get container status \"5c60d03e10a0a2b04ed188bef5ea3ef704d6137b4291e6365947fa739b50f745\": rpc error: code = NotFound desc = could not find container \"5c60d03e10a0a2b04ed188bef5ea3ef704d6137b4291e6365947fa739b50f745\": container with ID starting with 5c60d03e10a0a2b04ed188bef5ea3ef704d6137b4291e6365947fa739b50f745 not found: ID does not exist"
Dec 03 12:49:52 crc kubenswrapper[4711]: I1203 12:49:52.668729 4711 scope.go:117] "RemoveContainer" containerID="0d590ac250f716d313dcc2d5ac96a3b788cd92009296bc442d3ae8f6ad7bf683"
Dec 03 12:49:52 crc kubenswrapper[4711]: E1203 12:49:52.669313 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d590ac250f716d313dcc2d5ac96a3b788cd92009296bc442d3ae8f6ad7bf683\": container with ID starting with 0d590ac250f716d313dcc2d5ac96a3b788cd92009296bc442d3ae8f6ad7bf683 not found: ID does not exist" containerID="0d590ac250f716d313dcc2d5ac96a3b788cd92009296bc442d3ae8f6ad7bf683"
Dec 03 12:49:52 crc kubenswrapper[4711]: I1203 12:49:52.669367 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d590ac250f716d313dcc2d5ac96a3b788cd92009296bc442d3ae8f6ad7bf683"} err="failed to get container status \"0d590ac250f716d313dcc2d5ac96a3b788cd92009296bc442d3ae8f6ad7bf683\": rpc error: code = NotFound desc = could not find container \"0d590ac250f716d313dcc2d5ac96a3b788cd92009296bc442d3ae8f6ad7bf683\": container with ID starting with 0d590ac250f716d313dcc2d5ac96a3b788cd92009296bc442d3ae8f6ad7bf683 not found: ID does not exist"
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.136860 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.283874 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-dev\") pod \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") "
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.284254 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae7e3c8c-8b3d-46de-8359-c33d175209e7-httpd-run\") pod \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") "
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.284280 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-var-locks-brick\") pod \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") "
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.284310 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-run\") pod \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") "
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.284336 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae7e3c8c-8b3d-46de-8359-c33d175209e7-config-data\") pod \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") "
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.284375 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") "
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.284395 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae7e3c8c-8b3d-46de-8359-c33d175209e7-scripts\") pod \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") "
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.284422 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-lib-modules\") pod \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") "
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.284446 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-etc-iscsi\") pod \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") "
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.284493 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-sys\") pod \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") "
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.284521 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddfhc\" (UniqueName: \"kubernetes.io/projected/ae7e3c8c-8b3d-46de-8359-c33d175209e7-kube-api-access-ddfhc\") pod \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") "
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.284543 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-etc-nvme\") pod \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") "
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.284565 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae7e3c8c-8b3d-46de-8359-c33d175209e7-logs\") pod \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") "
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.284609 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\" (UID: \"ae7e3c8c-8b3d-46de-8359-c33d175209e7\") "
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.285624 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "ae7e3c8c-8b3d-46de-8359-c33d175209e7" (UID: "ae7e3c8c-8b3d-46de-8359-c33d175209e7"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.285633 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-sys" (OuterVolumeSpecName: "sys") pod "ae7e3c8c-8b3d-46de-8359-c33d175209e7" (UID: "ae7e3c8c-8b3d-46de-8359-c33d175209e7"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.285667 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-dev" (OuterVolumeSpecName: "dev") pod "ae7e3c8c-8b3d-46de-8359-c33d175209e7" (UID: "ae7e3c8c-8b3d-46de-8359-c33d175209e7"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.285705 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "ae7e3c8c-8b3d-46de-8359-c33d175209e7" (UID: "ae7e3c8c-8b3d-46de-8359-c33d175209e7"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.285736 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "ae7e3c8c-8b3d-46de-8359-c33d175209e7" (UID: "ae7e3c8c-8b3d-46de-8359-c33d175209e7"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.285842 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae7e3c8c-8b3d-46de-8359-c33d175209e7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ae7e3c8c-8b3d-46de-8359-c33d175209e7" (UID: "ae7e3c8c-8b3d-46de-8359-c33d175209e7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.285875 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "ae7e3c8c-8b3d-46de-8359-c33d175209e7" (UID: "ae7e3c8c-8b3d-46de-8359-c33d175209e7"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.286738 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae7e3c8c-8b3d-46de-8359-c33d175209e7-logs" (OuterVolumeSpecName: "logs") pod "ae7e3c8c-8b3d-46de-8359-c33d175209e7" (UID: "ae7e3c8c-8b3d-46de-8359-c33d175209e7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.289209 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-run" (OuterVolumeSpecName: "run") pod "ae7e3c8c-8b3d-46de-8359-c33d175209e7" (UID: "ae7e3c8c-8b3d-46de-8359-c33d175209e7"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.292630 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae7e3c8c-8b3d-46de-8359-c33d175209e7-scripts" (OuterVolumeSpecName: "scripts") pod "ae7e3c8c-8b3d-46de-8359-c33d175209e7" (UID: "ae7e3c8c-8b3d-46de-8359-c33d175209e7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.292757 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance-cache") pod "ae7e3c8c-8b3d-46de-8359-c33d175209e7" (UID: "ae7e3c8c-8b3d-46de-8359-c33d175209e7"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.293991 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae7e3c8c-8b3d-46de-8359-c33d175209e7-kube-api-access-ddfhc" (OuterVolumeSpecName: "kube-api-access-ddfhc") pod "ae7e3c8c-8b3d-46de-8359-c33d175209e7" (UID: "ae7e3c8c-8b3d-46de-8359-c33d175209e7"). InnerVolumeSpecName "kube-api-access-ddfhc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.296192 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "ae7e3c8c-8b3d-46de-8359-c33d175209e7" (UID: "ae7e3c8c-8b3d-46de-8359-c33d175209e7"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.348197 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae7e3c8c-8b3d-46de-8359-c33d175209e7-config-data" (OuterVolumeSpecName: "config-data") pod "ae7e3c8c-8b3d-46de-8359-c33d175209e7" (UID: "ae7e3c8c-8b3d-46de-8359-c33d175209e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.386009 4711 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-run\") on node \"crc\" DevicePath \"\""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.386042 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae7e3c8c-8b3d-46de-8359-c33d175209e7-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.386087 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" "
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.386098 4711 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae7e3c8c-8b3d-46de-8359-c33d175209e7-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.386107 4711 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-lib-modules\") on node \"crc\" DevicePath \"\""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.386115 4711 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-etc-iscsi\") on node \"crc\" DevicePath \"\""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.386123 4711 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-sys\") on node \"crc\" DevicePath \"\""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.389025 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddfhc\" (UniqueName: \"kubernetes.io/projected/ae7e3c8c-8b3d-46de-8359-c33d175209e7-kube-api-access-ddfhc\") on node \"crc\" DevicePath \"\""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.389078 4711 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-etc-nvme\") on node \"crc\" DevicePath \"\""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.389095 4711 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae7e3c8c-8b3d-46de-8359-c33d175209e7-logs\") on node \"crc\" DevicePath \"\""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.389139 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.389156 4711 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-dev\") on node \"crc\" DevicePath \"\""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.389167 4711 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae7e3c8c-8b3d-46de-8359-c33d175209e7-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.389180 4711 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ae7e3c8c-8b3d-46de-8359-c33d175209e7-var-locks-brick\") on node \"crc\" DevicePath \"\""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.400217 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc"
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.421386 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.490851 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.490879 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.498073 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0"
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.579459 4711 generic.go:334] "Generic (PLEG): container finished" podID="ae7e3c8c-8b3d-46de-8359-c33d175209e7" containerID="6c4faba728e6f7de002f6e67aadf9a6ce3ca606a520131d2f30be162e184b7aa" exitCode=0
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.579529 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"ae7e3c8c-8b3d-46de-8359-c33d175209e7","Type":"ContainerDied","Data":"6c4faba728e6f7de002f6e67aadf9a6ce3ca606a520131d2f30be162e184b7aa"}
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.579555 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"ae7e3c8c-8b3d-46de-8359-c33d175209e7","Type":"ContainerDied","Data":"68c329d5602a09881a0239bbd20b701898d1d6687559040c8711914159eccb84"}
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.579575 4711 scope.go:117] "RemoveContainer" containerID="6c4faba728e6f7de002f6e67aadf9a6ce3ca606a520131d2f30be162e184b7aa"
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.579691 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.583825 4711 generic.go:334] "Generic (PLEG): container finished" podID="4bfce555-bbde-4b58-9fcc-13b1e2db8a3c" containerID="bd9f23931c906d663b7cc8407e6555b6738ae66eaeff0ecb1cf7a82121351cba" exitCode=0
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.583931 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c","Type":"ContainerDied","Data":"bd9f23931c906d663b7cc8407e6555b6738ae66eaeff0ecb1cf7a82121351cba"}
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.583961 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c","Type":"ContainerDied","Data":"ed33e328d018aac630d6c557d0ab55b44979cece8ace2aa2ad8f58acd1f3fff1"}
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.584027 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0"
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.591168 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-dev\") pod \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") "
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.591203 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-config-data\") pod \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") "
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.591233 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-etc-iscsi\") pod \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") "
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.591258 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-var-locks-brick\") pod \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") "
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.591281 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") "
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.591303 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-httpd-run\") pod \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") "
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.591344 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-scripts\") pod \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") "
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.591378 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") "
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.591397 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnqkp\" (UniqueName: \"kubernetes.io/projected/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-kube-api-access-jnqkp\") pod \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") "
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.591442 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-logs\") pod \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") "
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.591469 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-etc-nvme\") pod \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") "
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.591515 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-run\") pod \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") "
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.591537 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-sys\") pod \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") "
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.591564 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-lib-modules\") pod \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\" (UID: \"4bfce555-bbde-4b58-9fcc-13b1e2db8a3c\") "
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.591273 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-dev" (OuterVolumeSpecName: "dev") pod "4bfce555-bbde-4b58-9fcc-13b1e2db8a3c" (UID: "4bfce555-bbde-4b58-9fcc-13b1e2db8a3c"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.591298 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "4bfce555-bbde-4b58-9fcc-13b1e2db8a3c" (UID: "4bfce555-bbde-4b58-9fcc-13b1e2db8a3c"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.591681 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4bfce555-bbde-4b58-9fcc-13b1e2db8a3c" (UID: "4bfce555-bbde-4b58-9fcc-13b1e2db8a3c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.591701 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "4bfce555-bbde-4b58-9fcc-13b1e2db8a3c" (UID: "4bfce555-bbde-4b58-9fcc-13b1e2db8a3c"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.591867 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "4bfce555-bbde-4b58-9fcc-13b1e2db8a3c" (UID: "4bfce555-bbde-4b58-9fcc-13b1e2db8a3c"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.591895 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-run" (OuterVolumeSpecName: "run") pod "4bfce555-bbde-4b58-9fcc-13b1e2db8a3c" (UID: "4bfce555-bbde-4b58-9fcc-13b1e2db8a3c"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.591925 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-sys" (OuterVolumeSpecName: "sys") pod "4bfce555-bbde-4b58-9fcc-13b1e2db8a3c" (UID: "4bfce555-bbde-4b58-9fcc-13b1e2db8a3c"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.591944 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "4bfce555-bbde-4b58-9fcc-13b1e2db8a3c" (UID: "4bfce555-bbde-4b58-9fcc-13b1e2db8a3c"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.592116 4711 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-etc-nvme\") on node \"crc\" DevicePath \"\""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.592143 4711 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-run\") on node \"crc\" DevicePath \"\""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.592200 4711 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-sys\") on node \"crc\" DevicePath \"\""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.592211 4711 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-lib-modules\") on node \"crc\" DevicePath \"\""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.592219 4711 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-dev\") on node \"crc\" DevicePath \"\""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.592228 4711 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-etc-iscsi\") on node \"crc\" DevicePath \"\""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.592235 4711 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-var-locks-brick\") on node \"crc\" DevicePath \"\""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.592245 4711 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.594520 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance-cache") pod "4bfce555-bbde-4b58-9fcc-13b1e2db8a3c" (UID: "4bfce555-bbde-4b58-9fcc-13b1e2db8a3c"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.595108 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-logs" (OuterVolumeSpecName: "logs") pod "4bfce555-bbde-4b58-9fcc-13b1e2db8a3c" (UID: "4bfce555-bbde-4b58-9fcc-13b1e2db8a3c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.595303 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-kube-api-access-jnqkp" (OuterVolumeSpecName: "kube-api-access-jnqkp") pod "4bfce555-bbde-4b58-9fcc-13b1e2db8a3c" (UID: "4bfce555-bbde-4b58-9fcc-13b1e2db8a3c"). InnerVolumeSpecName "kube-api-access-jnqkp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.596772 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "4bfce555-bbde-4b58-9fcc-13b1e2db8a3c" (UID: "4bfce555-bbde-4b58-9fcc-13b1e2db8a3c"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.596844 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-scripts" (OuterVolumeSpecName: "scripts") pod "4bfce555-bbde-4b58-9fcc-13b1e2db8a3c" (UID: "4bfce555-bbde-4b58-9fcc-13b1e2db8a3c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.604851 4711 scope.go:117] "RemoveContainer" containerID="9fe93a020d7261971c43f64fc162e597174a7c5f840ebbce678b618beff6ea09"
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.617041 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"]
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.623364 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"]
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.629659 4711 scope.go:117] "RemoveContainer" containerID="6c4faba728e6f7de002f6e67aadf9a6ce3ca606a520131d2f30be162e184b7aa"
Dec 03 12:49:53 crc kubenswrapper[4711]: E1203 12:49:53.631413 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c4faba728e6f7de002f6e67aadf9a6ce3ca606a520131d2f30be162e184b7aa\": container with ID starting with 6c4faba728e6f7de002f6e67aadf9a6ce3ca606a520131d2f30be162e184b7aa not found: ID does not exist" containerID="6c4faba728e6f7de002f6e67aadf9a6ce3ca606a520131d2f30be162e184b7aa"
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.631676 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c4faba728e6f7de002f6e67aadf9a6ce3ca606a520131d2f30be162e184b7aa"} err="failed to get container status \"6c4faba728e6f7de002f6e67aadf9a6ce3ca606a520131d2f30be162e184b7aa\": rpc error: code = NotFound desc = could not find container \"6c4faba728e6f7de002f6e67aadf9a6ce3ca606a520131d2f30be162e184b7aa\": container with ID starting with 6c4faba728e6f7de002f6e67aadf9a6ce3ca606a520131d2f30be162e184b7aa not found: ID does not exist"
Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.631822 4711 scope.go:117] "RemoveContainer" containerID="9fe93a020d7261971c43f64fc162e597174a7c5f840ebbce678b618beff6ea09"
Dec 
03 12:49:53 crc kubenswrapper[4711]: E1203 12:49:53.632277 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fe93a020d7261971c43f64fc162e597174a7c5f840ebbce678b618beff6ea09\": container with ID starting with 9fe93a020d7261971c43f64fc162e597174a7c5f840ebbce678b618beff6ea09 not found: ID does not exist" containerID="9fe93a020d7261971c43f64fc162e597174a7c5f840ebbce678b618beff6ea09" Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.632308 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fe93a020d7261971c43f64fc162e597174a7c5f840ebbce678b618beff6ea09"} err="failed to get container status \"9fe93a020d7261971c43f64fc162e597174a7c5f840ebbce678b618beff6ea09\": rpc error: code = NotFound desc = could not find container \"9fe93a020d7261971c43f64fc162e597174a7c5f840ebbce678b618beff6ea09\": container with ID starting with 9fe93a020d7261971c43f64fc162e597174a7c5f840ebbce678b618beff6ea09 not found: ID does not exist" Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.632333 4711 scope.go:117] "RemoveContainer" containerID="bd9f23931c906d663b7cc8407e6555b6738ae66eaeff0ecb1cf7a82121351cba" Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.643460 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-config-data" (OuterVolumeSpecName: "config-data") pod "4bfce555-bbde-4b58-9fcc-13b1e2db8a3c" (UID: "4bfce555-bbde-4b58-9fcc-13b1e2db8a3c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.651834 4711 scope.go:117] "RemoveContainer" containerID="307aeba66bd25e6f7c2f58ca07a49899392bd19e5a59f975825341a383b4b8d7" Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.694106 4711 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-logs\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.694135 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.694164 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.694175 4711 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.694188 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.694197 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnqkp\" (UniqueName: \"kubernetes.io/projected/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c-kube-api-access-jnqkp\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.708928 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on 
node "crc" Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.714633 4711 scope.go:117] "RemoveContainer" containerID="bd9f23931c906d663b7cc8407e6555b6738ae66eaeff0ecb1cf7a82121351cba" Dec 03 12:49:53 crc kubenswrapper[4711]: E1203 12:49:53.715217 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd9f23931c906d663b7cc8407e6555b6738ae66eaeff0ecb1cf7a82121351cba\": container with ID starting with bd9f23931c906d663b7cc8407e6555b6738ae66eaeff0ecb1cf7a82121351cba not found: ID does not exist" containerID="bd9f23931c906d663b7cc8407e6555b6738ae66eaeff0ecb1cf7a82121351cba" Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.715289 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd9f23931c906d663b7cc8407e6555b6738ae66eaeff0ecb1cf7a82121351cba"} err="failed to get container status \"bd9f23931c906d663b7cc8407e6555b6738ae66eaeff0ecb1cf7a82121351cba\": rpc error: code = NotFound desc = could not find container \"bd9f23931c906d663b7cc8407e6555b6738ae66eaeff0ecb1cf7a82121351cba\": container with ID starting with bd9f23931c906d663b7cc8407e6555b6738ae66eaeff0ecb1cf7a82121351cba not found: ID does not exist" Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.715334 4711 scope.go:117] "RemoveContainer" containerID="307aeba66bd25e6f7c2f58ca07a49899392bd19e5a59f975825341a383b4b8d7" Dec 03 12:49:53 crc kubenswrapper[4711]: E1203 12:49:53.715776 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"307aeba66bd25e6f7c2f58ca07a49899392bd19e5a59f975825341a383b4b8d7\": container with ID starting with 307aeba66bd25e6f7c2f58ca07a49899392bd19e5a59f975825341a383b4b8d7 not found: ID does not exist" containerID="307aeba66bd25e6f7c2f58ca07a49899392bd19e5a59f975825341a383b4b8d7" Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.715823 4711 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"307aeba66bd25e6f7c2f58ca07a49899392bd19e5a59f975825341a383b4b8d7"} err="failed to get container status \"307aeba66bd25e6f7c2f58ca07a49899392bd19e5a59f975825341a383b4b8d7\": rpc error: code = NotFound desc = could not find container \"307aeba66bd25e6f7c2f58ca07a49899392bd19e5a59f975825341a383b4b8d7\": container with ID starting with 307aeba66bd25e6f7c2f58ca07a49899392bd19e5a59f975825341a383b4b8d7 not found: ID does not exist" Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.716036 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.795895 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.795951 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.825831 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae7e3c8c-8b3d-46de-8359-c33d175209e7" path="/var/lib/kubelet/pods/ae7e3c8c-8b3d-46de-8359-c33d175209e7/volumes" Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.826982 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff7b9727-150f-4ee5-b22b-3147877bdaa6" path="/var/lib/kubelet/pods/ff7b9727-150f-4ee5-b22b-3147877bdaa6/volumes" Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.908023 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:49:53 crc kubenswrapper[4711]: I1203 12:49:53.915197 4711 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.778842 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-ht7xq"] Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.786316 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-ht7xq"] Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.817854 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glancebe2b-account-delete-b6qvj"] Dec 03 12:49:54 crc kubenswrapper[4711]: E1203 12:49:54.818206 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aab1c8ee-fd97-49f0-8376-67af38c710f3" containerName="extract-utilities" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.818230 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="aab1c8ee-fd97-49f0-8376-67af38c710f3" containerName="extract-utilities" Dec 03 12:49:54 crc kubenswrapper[4711]: E1203 12:49:54.818244 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bfce555-bbde-4b58-9fcc-13b1e2db8a3c" containerName="glance-log" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.818253 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bfce555-bbde-4b58-9fcc-13b1e2db8a3c" containerName="glance-log" Dec 03 12:49:54 crc kubenswrapper[4711]: E1203 12:49:54.818267 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bfce555-bbde-4b58-9fcc-13b1e2db8a3c" containerName="glance-httpd" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.818276 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bfce555-bbde-4b58-9fcc-13b1e2db8a3c" containerName="glance-httpd" Dec 03 12:49:54 crc kubenswrapper[4711]: E1203 12:49:54.818289 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81590464-a00d-470b-86f6-8050913cf609" containerName="glance-httpd" Dec 03 12:49:54 crc kubenswrapper[4711]: 
I1203 12:49:54.818296 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="81590464-a00d-470b-86f6-8050913cf609" containerName="glance-httpd" Dec 03 12:49:54 crc kubenswrapper[4711]: E1203 12:49:54.818313 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae7e3c8c-8b3d-46de-8359-c33d175209e7" containerName="glance-log" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.818321 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae7e3c8c-8b3d-46de-8359-c33d175209e7" containerName="glance-log" Dec 03 12:49:54 crc kubenswrapper[4711]: E1203 12:49:54.818330 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c1ad65c-d3ad-499e-a794-0d482cbe7df9" containerName="glance-httpd" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.818337 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c1ad65c-d3ad-499e-a794-0d482cbe7df9" containerName="glance-httpd" Dec 03 12:49:54 crc kubenswrapper[4711]: E1203 12:49:54.818354 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e22a45f5-3560-498f-8faa-e31d07aa4d48" containerName="glance-log" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.818364 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="e22a45f5-3560-498f-8faa-e31d07aa4d48" containerName="glance-log" Dec 03 12:49:54 crc kubenswrapper[4711]: E1203 12:49:54.818372 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f962c90c-c114-413c-804e-999d1b936b65" containerName="glance-httpd" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.818382 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="f962c90c-c114-413c-804e-999d1b936b65" containerName="glance-httpd" Dec 03 12:49:54 crc kubenswrapper[4711]: E1203 12:49:54.818398 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae7e3c8c-8b3d-46de-8359-c33d175209e7" containerName="glance-httpd" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.818408 4711 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="ae7e3c8c-8b3d-46de-8359-c33d175209e7" containerName="glance-httpd" Dec 03 12:49:54 crc kubenswrapper[4711]: E1203 12:49:54.818419 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e22a45f5-3560-498f-8faa-e31d07aa4d48" containerName="glance-httpd" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.818426 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="e22a45f5-3560-498f-8faa-e31d07aa4d48" containerName="glance-httpd" Dec 03 12:49:54 crc kubenswrapper[4711]: E1203 12:49:54.818440 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81590464-a00d-470b-86f6-8050913cf609" containerName="glance-log" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.818464 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="81590464-a00d-470b-86f6-8050913cf609" containerName="glance-log" Dec 03 12:49:54 crc kubenswrapper[4711]: E1203 12:49:54.818475 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff7b9727-150f-4ee5-b22b-3147877bdaa6" containerName="registry-server" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.818484 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff7b9727-150f-4ee5-b22b-3147877bdaa6" containerName="registry-server" Dec 03 12:49:54 crc kubenswrapper[4711]: E1203 12:49:54.818497 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aab1c8ee-fd97-49f0-8376-67af38c710f3" containerName="extract-content" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.818505 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="aab1c8ee-fd97-49f0-8376-67af38c710f3" containerName="extract-content" Dec 03 12:49:54 crc kubenswrapper[4711]: E1203 12:49:54.818520 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff7b9727-150f-4ee5-b22b-3147877bdaa6" containerName="extract-utilities" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.818528 4711 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ff7b9727-150f-4ee5-b22b-3147877bdaa6" containerName="extract-utilities" Dec 03 12:49:54 crc kubenswrapper[4711]: E1203 12:49:54.818539 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aab1c8ee-fd97-49f0-8376-67af38c710f3" containerName="registry-server" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.818546 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="aab1c8ee-fd97-49f0-8376-67af38c710f3" containerName="registry-server" Dec 03 12:49:54 crc kubenswrapper[4711]: E1203 12:49:54.818560 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f962c90c-c114-413c-804e-999d1b936b65" containerName="glance-log" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.818568 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="f962c90c-c114-413c-804e-999d1b936b65" containerName="glance-log" Dec 03 12:49:54 crc kubenswrapper[4711]: E1203 12:49:54.818584 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c1ad65c-d3ad-499e-a794-0d482cbe7df9" containerName="glance-log" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.818591 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c1ad65c-d3ad-499e-a794-0d482cbe7df9" containerName="glance-log" Dec 03 12:49:54 crc kubenswrapper[4711]: E1203 12:49:54.818601 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff7b9727-150f-4ee5-b22b-3147877bdaa6" containerName="extract-content" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.818608 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff7b9727-150f-4ee5-b22b-3147877bdaa6" containerName="extract-content" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.818789 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="f962c90c-c114-413c-804e-999d1b936b65" containerName="glance-httpd" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.818807 4711 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4bfce555-bbde-4b58-9fcc-13b1e2db8a3c" containerName="glance-log" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.818830 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c1ad65c-d3ad-499e-a794-0d482cbe7df9" containerName="glance-log" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.818842 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="81590464-a00d-470b-86f6-8050913cf609" containerName="glance-httpd" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.818881 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff7b9727-150f-4ee5-b22b-3147877bdaa6" containerName="registry-server" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.818891 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae7e3c8c-8b3d-46de-8359-c33d175209e7" containerName="glance-httpd" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.818900 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="f962c90c-c114-413c-804e-999d1b936b65" containerName="glance-log" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.818926 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="e22a45f5-3560-498f-8faa-e31d07aa4d48" containerName="glance-httpd" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.818938 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="aab1c8ee-fd97-49f0-8376-67af38c710f3" containerName="registry-server" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.818947 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae7e3c8c-8b3d-46de-8359-c33d175209e7" containerName="glance-log" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.818957 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="e22a45f5-3560-498f-8faa-e31d07aa4d48" containerName="glance-log" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.818971 4711 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="81590464-a00d-470b-86f6-8050913cf609" containerName="glance-log" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.818986 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bfce555-bbde-4b58-9fcc-13b1e2db8a3c" containerName="glance-httpd" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.818995 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c1ad65c-d3ad-499e-a794-0d482cbe7df9" containerName="glance-httpd" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.819568 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glancebe2b-account-delete-b6qvj" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.825706 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glancebe2b-account-delete-b6qvj"] Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.914412 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca20b9e6-a4ba-4ede-a898-367b7ec51f03-operator-scripts\") pod \"glancebe2b-account-delete-b6qvj\" (UID: \"ca20b9e6-a4ba-4ede-a898-367b7ec51f03\") " pod="glance-kuttl-tests/glancebe2b-account-delete-b6qvj" Dec 03 12:49:54 crc kubenswrapper[4711]: I1203 12:49:54.914465 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzckh\" (UniqueName: \"kubernetes.io/projected/ca20b9e6-a4ba-4ede-a898-367b7ec51f03-kube-api-access-rzckh\") pod \"glancebe2b-account-delete-b6qvj\" (UID: \"ca20b9e6-a4ba-4ede-a898-367b7ec51f03\") " pod="glance-kuttl-tests/glancebe2b-account-delete-b6qvj" Dec 03 12:49:55 crc kubenswrapper[4711]: I1203 12:49:55.016006 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca20b9e6-a4ba-4ede-a898-367b7ec51f03-operator-scripts\") pod 
\"glancebe2b-account-delete-b6qvj\" (UID: \"ca20b9e6-a4ba-4ede-a898-367b7ec51f03\") " pod="glance-kuttl-tests/glancebe2b-account-delete-b6qvj" Dec 03 12:49:55 crc kubenswrapper[4711]: I1203 12:49:55.016059 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzckh\" (UniqueName: \"kubernetes.io/projected/ca20b9e6-a4ba-4ede-a898-367b7ec51f03-kube-api-access-rzckh\") pod \"glancebe2b-account-delete-b6qvj\" (UID: \"ca20b9e6-a4ba-4ede-a898-367b7ec51f03\") " pod="glance-kuttl-tests/glancebe2b-account-delete-b6qvj" Dec 03 12:49:55 crc kubenswrapper[4711]: I1203 12:49:55.016816 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca20b9e6-a4ba-4ede-a898-367b7ec51f03-operator-scripts\") pod \"glancebe2b-account-delete-b6qvj\" (UID: \"ca20b9e6-a4ba-4ede-a898-367b7ec51f03\") " pod="glance-kuttl-tests/glancebe2b-account-delete-b6qvj" Dec 03 12:49:55 crc kubenswrapper[4711]: I1203 12:49:55.060972 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzckh\" (UniqueName: \"kubernetes.io/projected/ca20b9e6-a4ba-4ede-a898-367b7ec51f03-kube-api-access-rzckh\") pod \"glancebe2b-account-delete-b6qvj\" (UID: \"ca20b9e6-a4ba-4ede-a898-367b7ec51f03\") " pod="glance-kuttl-tests/glancebe2b-account-delete-b6qvj" Dec 03 12:49:55 crc kubenswrapper[4711]: I1203 12:49:55.135566 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glancebe2b-account-delete-b6qvj" Dec 03 12:49:55 crc kubenswrapper[4711]: I1203 12:49:55.597169 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glancebe2b-account-delete-b6qvj"] Dec 03 12:49:55 crc kubenswrapper[4711]: I1203 12:49:55.826884 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bfce555-bbde-4b58-9fcc-13b1e2db8a3c" path="/var/lib/kubelet/pods/4bfce555-bbde-4b58-9fcc-13b1e2db8a3c/volumes" Dec 03 12:49:55 crc kubenswrapper[4711]: I1203 12:49:55.827933 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="864de37c-1231-4155-988a-14b4bcb9c3aa" path="/var/lib/kubelet/pods/864de37c-1231-4155-988a-14b4bcb9c3aa/volumes" Dec 03 12:49:56 crc kubenswrapper[4711]: I1203 12:49:56.613610 4711 generic.go:334] "Generic (PLEG): container finished" podID="ca20b9e6-a4ba-4ede-a898-367b7ec51f03" containerID="69bf6bbc0e3b8b2d55660972252e816798aee39d3a5ddc6eb78a284efd29721c" exitCode=0 Dec 03 12:49:56 crc kubenswrapper[4711]: I1203 12:49:56.613719 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancebe2b-account-delete-b6qvj" event={"ID":"ca20b9e6-a4ba-4ede-a898-367b7ec51f03","Type":"ContainerDied","Data":"69bf6bbc0e3b8b2d55660972252e816798aee39d3a5ddc6eb78a284efd29721c"} Dec 03 12:49:56 crc kubenswrapper[4711]: I1203 12:49:56.613899 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancebe2b-account-delete-b6qvj" event={"ID":"ca20b9e6-a4ba-4ede-a898-367b7ec51f03","Type":"ContainerStarted","Data":"8406b8d426daf06b25dcc67a3f9b4a81f16d6074ac2ed21e2a681838b587b45b"} Dec 03 12:49:57 crc kubenswrapper[4711]: I1203 12:49:57.266238 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zrhrt"] Dec 03 12:49:57 crc kubenswrapper[4711]: I1203 12:49:57.267371 4711 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-zrhrt" podUID="4a738f0e-1cb5-49ee-9b12-999813383fff" containerName="registry-server" containerID="cri-o://c04f7cc4725af33d4d76fc2208377c2c6c0cbdb9be62325dc0841113d2f3468e" gracePeriod=2 Dec 03 12:49:57 crc kubenswrapper[4711]: I1203 12:49:57.626232 4711 generic.go:334] "Generic (PLEG): container finished" podID="4a738f0e-1cb5-49ee-9b12-999813383fff" containerID="c04f7cc4725af33d4d76fc2208377c2c6c0cbdb9be62325dc0841113d2f3468e" exitCode=0 Dec 03 12:49:57 crc kubenswrapper[4711]: I1203 12:49:57.626304 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrhrt" event={"ID":"4a738f0e-1cb5-49ee-9b12-999813383fff","Type":"ContainerDied","Data":"c04f7cc4725af33d4d76fc2208377c2c6c0cbdb9be62325dc0841113d2f3468e"} Dec 03 12:49:57 crc kubenswrapper[4711]: I1203 12:49:57.897456 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glancebe2b-account-delete-b6qvj" Dec 03 12:49:57 crc kubenswrapper[4711]: I1203 12:49:57.971055 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca20b9e6-a4ba-4ede-a898-367b7ec51f03-operator-scripts\") pod \"ca20b9e6-a4ba-4ede-a898-367b7ec51f03\" (UID: \"ca20b9e6-a4ba-4ede-a898-367b7ec51f03\") " Dec 03 12:49:57 crc kubenswrapper[4711]: I1203 12:49:57.971107 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzckh\" (UniqueName: \"kubernetes.io/projected/ca20b9e6-a4ba-4ede-a898-367b7ec51f03-kube-api-access-rzckh\") pod \"ca20b9e6-a4ba-4ede-a898-367b7ec51f03\" (UID: \"ca20b9e6-a4ba-4ede-a898-367b7ec51f03\") " Dec 03 12:49:57 crc kubenswrapper[4711]: I1203 12:49:57.971781 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca20b9e6-a4ba-4ede-a898-367b7ec51f03-operator-scripts" (OuterVolumeSpecName: "operator-scripts") 
pod "ca20b9e6-a4ba-4ede-a898-367b7ec51f03" (UID: "ca20b9e6-a4ba-4ede-a898-367b7ec51f03"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:49:57 crc kubenswrapper[4711]: I1203 12:49:57.981593 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca20b9e6-a4ba-4ede-a898-367b7ec51f03-kube-api-access-rzckh" (OuterVolumeSpecName: "kube-api-access-rzckh") pod "ca20b9e6-a4ba-4ede-a898-367b7ec51f03" (UID: "ca20b9e6-a4ba-4ede-a898-367b7ec51f03"). InnerVolumeSpecName "kube-api-access-rzckh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:49:58 crc kubenswrapper[4711]: I1203 12:49:58.072786 4711 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca20b9e6-a4ba-4ede-a898-367b7ec51f03-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:58 crc kubenswrapper[4711]: I1203 12:49:58.072825 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzckh\" (UniqueName: \"kubernetes.io/projected/ca20b9e6-a4ba-4ede-a898-367b7ec51f03-kube-api-access-rzckh\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:58 crc kubenswrapper[4711]: I1203 12:49:58.184803 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zrhrt" Dec 03 12:49:58 crc kubenswrapper[4711]: I1203 12:49:58.275709 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a738f0e-1cb5-49ee-9b12-999813383fff-catalog-content\") pod \"4a738f0e-1cb5-49ee-9b12-999813383fff\" (UID: \"4a738f0e-1cb5-49ee-9b12-999813383fff\") " Dec 03 12:49:58 crc kubenswrapper[4711]: I1203 12:49:58.275790 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a738f0e-1cb5-49ee-9b12-999813383fff-utilities\") pod \"4a738f0e-1cb5-49ee-9b12-999813383fff\" (UID: \"4a738f0e-1cb5-49ee-9b12-999813383fff\") " Dec 03 12:49:58 crc kubenswrapper[4711]: I1203 12:49:58.275833 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5xmb\" (UniqueName: \"kubernetes.io/projected/4a738f0e-1cb5-49ee-9b12-999813383fff-kube-api-access-d5xmb\") pod \"4a738f0e-1cb5-49ee-9b12-999813383fff\" (UID: \"4a738f0e-1cb5-49ee-9b12-999813383fff\") " Dec 03 12:49:58 crc kubenswrapper[4711]: I1203 12:49:58.276674 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a738f0e-1cb5-49ee-9b12-999813383fff-utilities" (OuterVolumeSpecName: "utilities") pod "4a738f0e-1cb5-49ee-9b12-999813383fff" (UID: "4a738f0e-1cb5-49ee-9b12-999813383fff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:49:58 crc kubenswrapper[4711]: I1203 12:49:58.279418 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a738f0e-1cb5-49ee-9b12-999813383fff-kube-api-access-d5xmb" (OuterVolumeSpecName: "kube-api-access-d5xmb") pod "4a738f0e-1cb5-49ee-9b12-999813383fff" (UID: "4a738f0e-1cb5-49ee-9b12-999813383fff"). InnerVolumeSpecName "kube-api-access-d5xmb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:49:58 crc kubenswrapper[4711]: I1203 12:49:58.330229 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a738f0e-1cb5-49ee-9b12-999813383fff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a738f0e-1cb5-49ee-9b12-999813383fff" (UID: "4a738f0e-1cb5-49ee-9b12-999813383fff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:49:58 crc kubenswrapper[4711]: I1203 12:49:58.377984 4711 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a738f0e-1cb5-49ee-9b12-999813383fff-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:58 crc kubenswrapper[4711]: I1203 12:49:58.378033 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5xmb\" (UniqueName: \"kubernetes.io/projected/4a738f0e-1cb5-49ee-9b12-999813383fff-kube-api-access-d5xmb\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:58 crc kubenswrapper[4711]: I1203 12:49:58.378053 4711 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a738f0e-1cb5-49ee-9b12-999813383fff-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:58 crc kubenswrapper[4711]: I1203 12:49:58.636168 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancebe2b-account-delete-b6qvj" event={"ID":"ca20b9e6-a4ba-4ede-a898-367b7ec51f03","Type":"ContainerDied","Data":"8406b8d426daf06b25dcc67a3f9b4a81f16d6074ac2ed21e2a681838b587b45b"} Dec 03 12:49:58 crc kubenswrapper[4711]: I1203 12:49:58.636234 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8406b8d426daf06b25dcc67a3f9b4a81f16d6074ac2ed21e2a681838b587b45b" Dec 03 12:49:58 crc kubenswrapper[4711]: I1203 12:49:58.636185 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glancebe2b-account-delete-b6qvj" Dec 03 12:49:58 crc kubenswrapper[4711]: I1203 12:49:58.638299 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrhrt" event={"ID":"4a738f0e-1cb5-49ee-9b12-999813383fff","Type":"ContainerDied","Data":"a426bb585044ada421c358ea2886fb608419eddf18f6e1f6ff92a59817aeaa2a"} Dec 03 12:49:58 crc kubenswrapper[4711]: I1203 12:49:58.638359 4711 scope.go:117] "RemoveContainer" containerID="c04f7cc4725af33d4d76fc2208377c2c6c0cbdb9be62325dc0841113d2f3468e" Dec 03 12:49:58 crc kubenswrapper[4711]: I1203 12:49:58.638386 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zrhrt" Dec 03 12:49:58 crc kubenswrapper[4711]: I1203 12:49:58.659340 4711 scope.go:117] "RemoveContainer" containerID="b91a16cf3c5cd00790b34dd0928c60efb70847fdc5750486cd97cdeef3063351" Dec 03 12:49:58 crc kubenswrapper[4711]: I1203 12:49:58.681167 4711 scope.go:117] "RemoveContainer" containerID="997b9c3965dbe47f13ab228345e9fe8907bfafa6396e9329647eb0766916e01a" Dec 03 12:49:58 crc kubenswrapper[4711]: I1203 12:49:58.687238 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zrhrt"] Dec 03 12:49:58 crc kubenswrapper[4711]: I1203 12:49:58.694409 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zrhrt"] Dec 03 12:49:59 crc kubenswrapper[4711]: I1203 12:49:59.827375 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a738f0e-1cb5-49ee-9b12-999813383fff" path="/var/lib/kubelet/pods/4a738f0e-1cb5-49ee-9b12-999813383fff/volumes" Dec 03 12:49:59 crc kubenswrapper[4711]: I1203 12:49:59.837649 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-ks97b"] Dec 03 12:49:59 crc kubenswrapper[4711]: I1203 12:49:59.844515 4711 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-ks97b"] Dec 03 12:49:59 crc kubenswrapper[4711]: I1203 12:49:59.854700 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glancebe2b-account-delete-b6qvj"] Dec 03 12:49:59 crc kubenswrapper[4711]: I1203 12:49:59.860620 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glancebe2b-account-delete-b6qvj"] Dec 03 12:49:59 crc kubenswrapper[4711]: I1203 12:49:59.865602 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-be2b-account-create-update-fw7h5"] Dec 03 12:49:59 crc kubenswrapper[4711]: I1203 12:49:59.870805 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-be2b-account-create-update-fw7h5"] Dec 03 12:50:00 crc kubenswrapper[4711]: I1203 12:50:00.562578 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-lbmrn"] Dec 03 12:50:00 crc kubenswrapper[4711]: E1203 12:50:00.562993 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca20b9e6-a4ba-4ede-a898-367b7ec51f03" containerName="mariadb-account-delete" Dec 03 12:50:00 crc kubenswrapper[4711]: I1203 12:50:00.563021 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca20b9e6-a4ba-4ede-a898-367b7ec51f03" containerName="mariadb-account-delete" Dec 03 12:50:00 crc kubenswrapper[4711]: E1203 12:50:00.563065 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a738f0e-1cb5-49ee-9b12-999813383fff" containerName="registry-server" Dec 03 12:50:00 crc kubenswrapper[4711]: I1203 12:50:00.563074 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a738f0e-1cb5-49ee-9b12-999813383fff" containerName="registry-server" Dec 03 12:50:00 crc kubenswrapper[4711]: E1203 12:50:00.563085 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a738f0e-1cb5-49ee-9b12-999813383fff" containerName="extract-content" Dec 03 12:50:00 crc 
kubenswrapper[4711]: I1203 12:50:00.563093 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a738f0e-1cb5-49ee-9b12-999813383fff" containerName="extract-content" Dec 03 12:50:00 crc kubenswrapper[4711]: E1203 12:50:00.563102 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a738f0e-1cb5-49ee-9b12-999813383fff" containerName="extract-utilities" Dec 03 12:50:00 crc kubenswrapper[4711]: I1203 12:50:00.563110 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a738f0e-1cb5-49ee-9b12-999813383fff" containerName="extract-utilities" Dec 03 12:50:00 crc kubenswrapper[4711]: I1203 12:50:00.563249 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca20b9e6-a4ba-4ede-a898-367b7ec51f03" containerName="mariadb-account-delete" Dec 03 12:50:00 crc kubenswrapper[4711]: I1203 12:50:00.563276 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a738f0e-1cb5-49ee-9b12-999813383fff" containerName="registry-server" Dec 03 12:50:00 crc kubenswrapper[4711]: I1203 12:50:00.563848 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-lbmrn" Dec 03 12:50:00 crc kubenswrapper[4711]: I1203 12:50:00.569272 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-lbmrn"] Dec 03 12:50:00 crc kubenswrapper[4711]: I1203 12:50:00.665881 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-8237-account-create-update-48lr9"] Dec 03 12:50:00 crc kubenswrapper[4711]: I1203 12:50:00.666861 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-8237-account-create-update-48lr9" Dec 03 12:50:00 crc kubenswrapper[4711]: I1203 12:50:00.670368 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Dec 03 12:50:00 crc kubenswrapper[4711]: I1203 12:50:00.674414 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-8237-account-create-update-48lr9"] Dec 03 12:50:00 crc kubenswrapper[4711]: I1203 12:50:00.713275 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn7lr\" (UniqueName: \"kubernetes.io/projected/0c9a14a1-1d89-4eab-8aee-10d3c98ce90f-kube-api-access-gn7lr\") pod \"glance-db-create-lbmrn\" (UID: \"0c9a14a1-1d89-4eab-8aee-10d3c98ce90f\") " pod="glance-kuttl-tests/glance-db-create-lbmrn" Dec 03 12:50:00 crc kubenswrapper[4711]: I1203 12:50:00.713402 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c9a14a1-1d89-4eab-8aee-10d3c98ce90f-operator-scripts\") pod \"glance-db-create-lbmrn\" (UID: \"0c9a14a1-1d89-4eab-8aee-10d3c98ce90f\") " pod="glance-kuttl-tests/glance-db-create-lbmrn" Dec 03 12:50:00 crc kubenswrapper[4711]: I1203 12:50:00.815462 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn7lr\" (UniqueName: \"kubernetes.io/projected/0c9a14a1-1d89-4eab-8aee-10d3c98ce90f-kube-api-access-gn7lr\") pod \"glance-db-create-lbmrn\" (UID: \"0c9a14a1-1d89-4eab-8aee-10d3c98ce90f\") " pod="glance-kuttl-tests/glance-db-create-lbmrn" Dec 03 12:50:00 crc kubenswrapper[4711]: I1203 12:50:00.815556 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c9a14a1-1d89-4eab-8aee-10d3c98ce90f-operator-scripts\") pod \"glance-db-create-lbmrn\" (UID: 
\"0c9a14a1-1d89-4eab-8aee-10d3c98ce90f\") " pod="glance-kuttl-tests/glance-db-create-lbmrn" Dec 03 12:50:00 crc kubenswrapper[4711]: I1203 12:50:00.815600 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md8gj\" (UniqueName: \"kubernetes.io/projected/1c7376b2-9de2-4c0f-8311-8e422b08f2cb-kube-api-access-md8gj\") pod \"glance-8237-account-create-update-48lr9\" (UID: \"1c7376b2-9de2-4c0f-8311-8e422b08f2cb\") " pod="glance-kuttl-tests/glance-8237-account-create-update-48lr9" Dec 03 12:50:00 crc kubenswrapper[4711]: I1203 12:50:00.815661 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c7376b2-9de2-4c0f-8311-8e422b08f2cb-operator-scripts\") pod \"glance-8237-account-create-update-48lr9\" (UID: \"1c7376b2-9de2-4c0f-8311-8e422b08f2cb\") " pod="glance-kuttl-tests/glance-8237-account-create-update-48lr9" Dec 03 12:50:00 crc kubenswrapper[4711]: I1203 12:50:00.816362 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c9a14a1-1d89-4eab-8aee-10d3c98ce90f-operator-scripts\") pod \"glance-db-create-lbmrn\" (UID: \"0c9a14a1-1d89-4eab-8aee-10d3c98ce90f\") " pod="glance-kuttl-tests/glance-db-create-lbmrn" Dec 03 12:50:00 crc kubenswrapper[4711]: I1203 12:50:00.833540 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn7lr\" (UniqueName: \"kubernetes.io/projected/0c9a14a1-1d89-4eab-8aee-10d3c98ce90f-kube-api-access-gn7lr\") pod \"glance-db-create-lbmrn\" (UID: \"0c9a14a1-1d89-4eab-8aee-10d3c98ce90f\") " pod="glance-kuttl-tests/glance-db-create-lbmrn" Dec 03 12:50:00 crc kubenswrapper[4711]: I1203 12:50:00.891124 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-lbmrn" Dec 03 12:50:00 crc kubenswrapper[4711]: I1203 12:50:00.916945 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c7376b2-9de2-4c0f-8311-8e422b08f2cb-operator-scripts\") pod \"glance-8237-account-create-update-48lr9\" (UID: \"1c7376b2-9de2-4c0f-8311-8e422b08f2cb\") " pod="glance-kuttl-tests/glance-8237-account-create-update-48lr9" Dec 03 12:50:00 crc kubenswrapper[4711]: I1203 12:50:00.917150 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md8gj\" (UniqueName: \"kubernetes.io/projected/1c7376b2-9de2-4c0f-8311-8e422b08f2cb-kube-api-access-md8gj\") pod \"glance-8237-account-create-update-48lr9\" (UID: \"1c7376b2-9de2-4c0f-8311-8e422b08f2cb\") " pod="glance-kuttl-tests/glance-8237-account-create-update-48lr9" Dec 03 12:50:00 crc kubenswrapper[4711]: I1203 12:50:00.918499 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c7376b2-9de2-4c0f-8311-8e422b08f2cb-operator-scripts\") pod \"glance-8237-account-create-update-48lr9\" (UID: \"1c7376b2-9de2-4c0f-8311-8e422b08f2cb\") " pod="glance-kuttl-tests/glance-8237-account-create-update-48lr9" Dec 03 12:50:00 crc kubenswrapper[4711]: I1203 12:50:00.949833 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md8gj\" (UniqueName: \"kubernetes.io/projected/1c7376b2-9de2-4c0f-8311-8e422b08f2cb-kube-api-access-md8gj\") pod \"glance-8237-account-create-update-48lr9\" (UID: \"1c7376b2-9de2-4c0f-8311-8e422b08f2cb\") " pod="glance-kuttl-tests/glance-8237-account-create-update-48lr9" Dec 03 12:50:00 crc kubenswrapper[4711]: I1203 12:50:00.987178 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-8237-account-create-update-48lr9" Dec 03 12:50:01 crc kubenswrapper[4711]: I1203 12:50:01.149681 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-lbmrn"] Dec 03 12:50:01 crc kubenswrapper[4711]: W1203 12:50:01.162319 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c9a14a1_1d89_4eab_8aee_10d3c98ce90f.slice/crio-978f61e31ea406247baad69830f8c7fb2eae8e11c87ac94943ca16b12436cee6 WatchSource:0}: Error finding container 978f61e31ea406247baad69830f8c7fb2eae8e11c87ac94943ca16b12436cee6: Status 404 returned error can't find the container with id 978f61e31ea406247baad69830f8c7fb2eae8e11c87ac94943ca16b12436cee6 Dec 03 12:50:01 crc kubenswrapper[4711]: I1203 12:50:01.422631 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-8237-account-create-update-48lr9"] Dec 03 12:50:01 crc kubenswrapper[4711]: W1203 12:50:01.424994 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c7376b2_9de2_4c0f_8311_8e422b08f2cb.slice/crio-ebe8f04a10675266b1aee961f07e56d4c595f3d49d423f2348506b9b11706eb9 WatchSource:0}: Error finding container ebe8f04a10675266b1aee961f07e56d4c595f3d49d423f2348506b9b11706eb9: Status 404 returned error can't find the container with id ebe8f04a10675266b1aee961f07e56d4c595f3d49d423f2348506b9b11706eb9 Dec 03 12:50:01 crc kubenswrapper[4711]: I1203 12:50:01.664775 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-lbmrn" event={"ID":"0c9a14a1-1d89-4eab-8aee-10d3c98ce90f","Type":"ContainerStarted","Data":"978f61e31ea406247baad69830f8c7fb2eae8e11c87ac94943ca16b12436cee6"} Dec 03 12:50:01 crc kubenswrapper[4711]: I1203 12:50:01.665808 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-8237-account-create-update-48lr9" event={"ID":"1c7376b2-9de2-4c0f-8311-8e422b08f2cb","Type":"ContainerStarted","Data":"ebe8f04a10675266b1aee961f07e56d4c595f3d49d423f2348506b9b11706eb9"} Dec 03 12:50:01 crc kubenswrapper[4711]: I1203 12:50:01.832037 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92a177a3-ca3b-4aad-8365-791bbc65e089" path="/var/lib/kubelet/pods/92a177a3-ca3b-4aad-8365-791bbc65e089/volumes" Dec 03 12:50:01 crc kubenswrapper[4711]: I1203 12:50:01.833578 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1" path="/var/lib/kubelet/pods/a30f24fa-04a4-4d3c-aebb-ec95e57b1cc1/volumes" Dec 03 12:50:01 crc kubenswrapper[4711]: I1203 12:50:01.834723 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca20b9e6-a4ba-4ede-a898-367b7ec51f03" path="/var/lib/kubelet/pods/ca20b9e6-a4ba-4ede-a898-367b7ec51f03/volumes" Dec 03 12:50:02 crc kubenswrapper[4711]: I1203 12:50:02.678521 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-lbmrn" event={"ID":"0c9a14a1-1d89-4eab-8aee-10d3c98ce90f","Type":"ContainerStarted","Data":"e11b1a0d824fb16772da6315083e47a8071c5073a511a2cc140db2459927701e"} Dec 03 12:50:03 crc kubenswrapper[4711]: I1203 12:50:03.690021 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-8237-account-create-update-48lr9" event={"ID":"1c7376b2-9de2-4c0f-8311-8e422b08f2cb","Type":"ContainerStarted","Data":"6bce48f682c206b57606dcb50948562f629ae57ce54131a03311755ba2859454"} Dec 03 12:50:03 crc kubenswrapper[4711]: I1203 12:50:03.710449 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-8237-account-create-update-48lr9" podStartSLOduration=3.71042773 podStartE2EDuration="3.71042773s" podCreationTimestamp="2025-12-03 12:50:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:50:03.702998808 +0000 UTC m=+2122.372250063" watchObservedRunningTime="2025-12-03 12:50:03.71042773 +0000 UTC m=+2122.379678975" Dec 03 12:50:03 crc kubenswrapper[4711]: I1203 12:50:03.725277 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-create-lbmrn" podStartSLOduration=3.725259937 podStartE2EDuration="3.725259937s" podCreationTimestamp="2025-12-03 12:50:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:50:03.721607026 +0000 UTC m=+2122.390858291" watchObservedRunningTime="2025-12-03 12:50:03.725259937 +0000 UTC m=+2122.394511192" Dec 03 12:50:05 crc kubenswrapper[4711]: I1203 12:50:05.713138 4711 generic.go:334] "Generic (PLEG): container finished" podID="0c9a14a1-1d89-4eab-8aee-10d3c98ce90f" containerID="e11b1a0d824fb16772da6315083e47a8071c5073a511a2cc140db2459927701e" exitCode=0 Dec 03 12:50:05 crc kubenswrapper[4711]: I1203 12:50:05.713212 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-lbmrn" event={"ID":"0c9a14a1-1d89-4eab-8aee-10d3c98ce90f","Type":"ContainerDied","Data":"e11b1a0d824fb16772da6315083e47a8071c5073a511a2cc140db2459927701e"} Dec 03 12:50:05 crc kubenswrapper[4711]: I1203 12:50:05.716241 4711 generic.go:334] "Generic (PLEG): container finished" podID="1c7376b2-9de2-4c0f-8311-8e422b08f2cb" containerID="6bce48f682c206b57606dcb50948562f629ae57ce54131a03311755ba2859454" exitCode=0 Dec 03 12:50:05 crc kubenswrapper[4711]: I1203 12:50:05.716294 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-8237-account-create-update-48lr9" event={"ID":"1c7376b2-9de2-4c0f-8311-8e422b08f2cb","Type":"ContainerDied","Data":"6bce48f682c206b57606dcb50948562f629ae57ce54131a03311755ba2859454"} Dec 03 12:50:07 crc kubenswrapper[4711]: I1203 
12:50:07.051805 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-8237-account-create-update-48lr9" Dec 03 12:50:07 crc kubenswrapper[4711]: I1203 12:50:07.057953 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-lbmrn" Dec 03 12:50:07 crc kubenswrapper[4711]: I1203 12:50:07.119231 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c7376b2-9de2-4c0f-8311-8e422b08f2cb-operator-scripts\") pod \"1c7376b2-9de2-4c0f-8311-8e422b08f2cb\" (UID: \"1c7376b2-9de2-4c0f-8311-8e422b08f2cb\") " Dec 03 12:50:07 crc kubenswrapper[4711]: I1203 12:50:07.119291 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn7lr\" (UniqueName: \"kubernetes.io/projected/0c9a14a1-1d89-4eab-8aee-10d3c98ce90f-kube-api-access-gn7lr\") pod \"0c9a14a1-1d89-4eab-8aee-10d3c98ce90f\" (UID: \"0c9a14a1-1d89-4eab-8aee-10d3c98ce90f\") " Dec 03 12:50:07 crc kubenswrapper[4711]: I1203 12:50:07.119357 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-md8gj\" (UniqueName: \"kubernetes.io/projected/1c7376b2-9de2-4c0f-8311-8e422b08f2cb-kube-api-access-md8gj\") pod \"1c7376b2-9de2-4c0f-8311-8e422b08f2cb\" (UID: \"1c7376b2-9de2-4c0f-8311-8e422b08f2cb\") " Dec 03 12:50:07 crc kubenswrapper[4711]: I1203 12:50:07.119436 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c9a14a1-1d89-4eab-8aee-10d3c98ce90f-operator-scripts\") pod \"0c9a14a1-1d89-4eab-8aee-10d3c98ce90f\" (UID: \"0c9a14a1-1d89-4eab-8aee-10d3c98ce90f\") " Dec 03 12:50:07 crc kubenswrapper[4711]: I1203 12:50:07.120392 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/0c9a14a1-1d89-4eab-8aee-10d3c98ce90f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0c9a14a1-1d89-4eab-8aee-10d3c98ce90f" (UID: "0c9a14a1-1d89-4eab-8aee-10d3c98ce90f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:50:07 crc kubenswrapper[4711]: I1203 12:50:07.121116 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c7376b2-9de2-4c0f-8311-8e422b08f2cb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1c7376b2-9de2-4c0f-8311-8e422b08f2cb" (UID: "1c7376b2-9de2-4c0f-8311-8e422b08f2cb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:50:07 crc kubenswrapper[4711]: I1203 12:50:07.124307 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c9a14a1-1d89-4eab-8aee-10d3c98ce90f-kube-api-access-gn7lr" (OuterVolumeSpecName: "kube-api-access-gn7lr") pod "0c9a14a1-1d89-4eab-8aee-10d3c98ce90f" (UID: "0c9a14a1-1d89-4eab-8aee-10d3c98ce90f"). InnerVolumeSpecName "kube-api-access-gn7lr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:50:07 crc kubenswrapper[4711]: I1203 12:50:07.124565 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c7376b2-9de2-4c0f-8311-8e422b08f2cb-kube-api-access-md8gj" (OuterVolumeSpecName: "kube-api-access-md8gj") pod "1c7376b2-9de2-4c0f-8311-8e422b08f2cb" (UID: "1c7376b2-9de2-4c0f-8311-8e422b08f2cb"). InnerVolumeSpecName "kube-api-access-md8gj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:50:07 crc kubenswrapper[4711]: I1203 12:50:07.221072 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn7lr\" (UniqueName: \"kubernetes.io/projected/0c9a14a1-1d89-4eab-8aee-10d3c98ce90f-kube-api-access-gn7lr\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:07 crc kubenswrapper[4711]: I1203 12:50:07.221109 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-md8gj\" (UniqueName: \"kubernetes.io/projected/1c7376b2-9de2-4c0f-8311-8e422b08f2cb-kube-api-access-md8gj\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:07 crc kubenswrapper[4711]: I1203 12:50:07.221123 4711 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c9a14a1-1d89-4eab-8aee-10d3c98ce90f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:07 crc kubenswrapper[4711]: I1203 12:50:07.221135 4711 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c7376b2-9de2-4c0f-8311-8e422b08f2cb-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:07 crc kubenswrapper[4711]: I1203 12:50:07.736657 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-lbmrn" event={"ID":"0c9a14a1-1d89-4eab-8aee-10d3c98ce90f","Type":"ContainerDied","Data":"978f61e31ea406247baad69830f8c7fb2eae8e11c87ac94943ca16b12436cee6"} Dec 03 12:50:07 crc kubenswrapper[4711]: I1203 12:50:07.736687 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-lbmrn" Dec 03 12:50:07 crc kubenswrapper[4711]: I1203 12:50:07.736703 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="978f61e31ea406247baad69830f8c7fb2eae8e11c87ac94943ca16b12436cee6" Dec 03 12:50:07 crc kubenswrapper[4711]: I1203 12:50:07.738412 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-8237-account-create-update-48lr9" event={"ID":"1c7376b2-9de2-4c0f-8311-8e422b08f2cb","Type":"ContainerDied","Data":"ebe8f04a10675266b1aee961f07e56d4c595f3d49d423f2348506b9b11706eb9"} Dec 03 12:50:07 crc kubenswrapper[4711]: I1203 12:50:07.738442 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebe8f04a10675266b1aee961f07e56d4c595f3d49d423f2348506b9b11706eb9" Dec 03 12:50:07 crc kubenswrapper[4711]: I1203 12:50:07.738462 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-8237-account-create-update-48lr9" Dec 03 12:50:10 crc kubenswrapper[4711]: I1203 12:50:10.793756 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-g5vp7"] Dec 03 12:50:10 crc kubenswrapper[4711]: E1203 12:50:10.794406 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c7376b2-9de2-4c0f-8311-8e422b08f2cb" containerName="mariadb-account-create-update" Dec 03 12:50:10 crc kubenswrapper[4711]: I1203 12:50:10.794422 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c7376b2-9de2-4c0f-8311-8e422b08f2cb" containerName="mariadb-account-create-update" Dec 03 12:50:10 crc kubenswrapper[4711]: E1203 12:50:10.794439 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c9a14a1-1d89-4eab-8aee-10d3c98ce90f" containerName="mariadb-database-create" Dec 03 12:50:10 crc kubenswrapper[4711]: I1203 12:50:10.794446 4711 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0c9a14a1-1d89-4eab-8aee-10d3c98ce90f" containerName="mariadb-database-create" Dec 03 12:50:10 crc kubenswrapper[4711]: I1203 12:50:10.794600 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c7376b2-9de2-4c0f-8311-8e422b08f2cb" containerName="mariadb-account-create-update" Dec 03 12:50:10 crc kubenswrapper[4711]: I1203 12:50:10.794619 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c9a14a1-1d89-4eab-8aee-10d3c98ce90f" containerName="mariadb-database-create" Dec 03 12:50:10 crc kubenswrapper[4711]: I1203 12:50:10.795169 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-g5vp7" Dec 03 12:50:10 crc kubenswrapper[4711]: I1203 12:50:10.800834 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-8dvvn" Dec 03 12:50:10 crc kubenswrapper[4711]: I1203 12:50:10.801193 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Dec 03 12:50:10 crc kubenswrapper[4711]: I1203 12:50:10.808804 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-g5vp7"] Dec 03 12:50:10 crc kubenswrapper[4711]: I1203 12:50:10.889087 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx7rp\" (UniqueName: \"kubernetes.io/projected/70e886bc-d0fc-4024-a751-3bc1e4233c3d-kube-api-access-kx7rp\") pod \"glance-db-sync-g5vp7\" (UID: \"70e886bc-d0fc-4024-a751-3bc1e4233c3d\") " pod="glance-kuttl-tests/glance-db-sync-g5vp7" Dec 03 12:50:10 crc kubenswrapper[4711]: I1203 12:50:10.889291 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/70e886bc-d0fc-4024-a751-3bc1e4233c3d-db-sync-config-data\") pod \"glance-db-sync-g5vp7\" (UID: \"70e886bc-d0fc-4024-a751-3bc1e4233c3d\") " 
pod="glance-kuttl-tests/glance-db-sync-g5vp7" Dec 03 12:50:10 crc kubenswrapper[4711]: I1203 12:50:10.889348 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70e886bc-d0fc-4024-a751-3bc1e4233c3d-config-data\") pod \"glance-db-sync-g5vp7\" (UID: \"70e886bc-d0fc-4024-a751-3bc1e4233c3d\") " pod="glance-kuttl-tests/glance-db-sync-g5vp7" Dec 03 12:50:10 crc kubenswrapper[4711]: I1203 12:50:10.991664 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/70e886bc-d0fc-4024-a751-3bc1e4233c3d-db-sync-config-data\") pod \"glance-db-sync-g5vp7\" (UID: \"70e886bc-d0fc-4024-a751-3bc1e4233c3d\") " pod="glance-kuttl-tests/glance-db-sync-g5vp7" Dec 03 12:50:10 crc kubenswrapper[4711]: I1203 12:50:10.991978 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70e886bc-d0fc-4024-a751-3bc1e4233c3d-config-data\") pod \"glance-db-sync-g5vp7\" (UID: \"70e886bc-d0fc-4024-a751-3bc1e4233c3d\") " pod="glance-kuttl-tests/glance-db-sync-g5vp7" Dec 03 12:50:10 crc kubenswrapper[4711]: I1203 12:50:10.992031 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx7rp\" (UniqueName: \"kubernetes.io/projected/70e886bc-d0fc-4024-a751-3bc1e4233c3d-kube-api-access-kx7rp\") pod \"glance-db-sync-g5vp7\" (UID: \"70e886bc-d0fc-4024-a751-3bc1e4233c3d\") " pod="glance-kuttl-tests/glance-db-sync-g5vp7" Dec 03 12:50:11 crc kubenswrapper[4711]: I1203 12:50:10.997759 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/70e886bc-d0fc-4024-a751-3bc1e4233c3d-db-sync-config-data\") pod \"glance-db-sync-g5vp7\" (UID: \"70e886bc-d0fc-4024-a751-3bc1e4233c3d\") " pod="glance-kuttl-tests/glance-db-sync-g5vp7" Dec 03 12:50:11 crc 
kubenswrapper[4711]: I1203 12:50:11.004948 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70e886bc-d0fc-4024-a751-3bc1e4233c3d-config-data\") pod \"glance-db-sync-g5vp7\" (UID: \"70e886bc-d0fc-4024-a751-3bc1e4233c3d\") " pod="glance-kuttl-tests/glance-db-sync-g5vp7" Dec 03 12:50:11 crc kubenswrapper[4711]: I1203 12:50:11.007237 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx7rp\" (UniqueName: \"kubernetes.io/projected/70e886bc-d0fc-4024-a751-3bc1e4233c3d-kube-api-access-kx7rp\") pod \"glance-db-sync-g5vp7\" (UID: \"70e886bc-d0fc-4024-a751-3bc1e4233c3d\") " pod="glance-kuttl-tests/glance-db-sync-g5vp7" Dec 03 12:50:11 crc kubenswrapper[4711]: I1203 12:50:11.120160 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-g5vp7" Dec 03 12:50:11 crc kubenswrapper[4711]: I1203 12:50:11.642001 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-g5vp7"] Dec 03 12:50:11 crc kubenswrapper[4711]: I1203 12:50:11.785352 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-g5vp7" event={"ID":"70e886bc-d0fc-4024-a751-3bc1e4233c3d","Type":"ContainerStarted","Data":"f73f639bbd8405b9eecdc25c55d21923541e7daff0db6b27ad4711f0c68b2768"} Dec 03 12:50:12 crc kubenswrapper[4711]: I1203 12:50:12.795045 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-g5vp7" event={"ID":"70e886bc-d0fc-4024-a751-3bc1e4233c3d","Type":"ContainerStarted","Data":"00512b127b4de6603bd549e8a899549fc0a1fb96678d6cb33014e2d9fdeaf07a"} Dec 03 12:50:12 crc kubenswrapper[4711]: I1203 12:50:12.811224 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-g5vp7" podStartSLOduration=2.811207523 podStartE2EDuration="2.811207523s" podCreationTimestamp="2025-12-03 
12:50:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:50:12.810616447 +0000 UTC m=+2131.479867702" watchObservedRunningTime="2025-12-03 12:50:12.811207523 +0000 UTC m=+2131.480458778" Dec 03 12:50:15 crc kubenswrapper[4711]: I1203 12:50:15.827466 4711 generic.go:334] "Generic (PLEG): container finished" podID="70e886bc-d0fc-4024-a751-3bc1e4233c3d" containerID="00512b127b4de6603bd549e8a899549fc0a1fb96678d6cb33014e2d9fdeaf07a" exitCode=0 Dec 03 12:50:15 crc kubenswrapper[4711]: I1203 12:50:15.835665 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-g5vp7" event={"ID":"70e886bc-d0fc-4024-a751-3bc1e4233c3d","Type":"ContainerDied","Data":"00512b127b4de6603bd549e8a899549fc0a1fb96678d6cb33014e2d9fdeaf07a"} Dec 03 12:50:17 crc kubenswrapper[4711]: I1203 12:50:17.138142 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-g5vp7" Dec 03 12:50:17 crc kubenswrapper[4711]: I1203 12:50:17.233568 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx7rp\" (UniqueName: \"kubernetes.io/projected/70e886bc-d0fc-4024-a751-3bc1e4233c3d-kube-api-access-kx7rp\") pod \"70e886bc-d0fc-4024-a751-3bc1e4233c3d\" (UID: \"70e886bc-d0fc-4024-a751-3bc1e4233c3d\") " Dec 03 12:50:17 crc kubenswrapper[4711]: I1203 12:50:17.233625 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70e886bc-d0fc-4024-a751-3bc1e4233c3d-config-data\") pod \"70e886bc-d0fc-4024-a751-3bc1e4233c3d\" (UID: \"70e886bc-d0fc-4024-a751-3bc1e4233c3d\") " Dec 03 12:50:17 crc kubenswrapper[4711]: I1203 12:50:17.233715 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/70e886bc-d0fc-4024-a751-3bc1e4233c3d-db-sync-config-data\") pod \"70e886bc-d0fc-4024-a751-3bc1e4233c3d\" (UID: \"70e886bc-d0fc-4024-a751-3bc1e4233c3d\") " Dec 03 12:50:17 crc kubenswrapper[4711]: I1203 12:50:17.239567 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70e886bc-d0fc-4024-a751-3bc1e4233c3d-kube-api-access-kx7rp" (OuterVolumeSpecName: "kube-api-access-kx7rp") pod "70e886bc-d0fc-4024-a751-3bc1e4233c3d" (UID: "70e886bc-d0fc-4024-a751-3bc1e4233c3d"). InnerVolumeSpecName "kube-api-access-kx7rp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:50:17 crc kubenswrapper[4711]: I1203 12:50:17.240395 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70e886bc-d0fc-4024-a751-3bc1e4233c3d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "70e886bc-d0fc-4024-a751-3bc1e4233c3d" (UID: "70e886bc-d0fc-4024-a751-3bc1e4233c3d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:50:17 crc kubenswrapper[4711]: I1203 12:50:17.277108 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70e886bc-d0fc-4024-a751-3bc1e4233c3d-config-data" (OuterVolumeSpecName: "config-data") pod "70e886bc-d0fc-4024-a751-3bc1e4233c3d" (UID: "70e886bc-d0fc-4024-a751-3bc1e4233c3d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:50:17 crc kubenswrapper[4711]: I1203 12:50:17.336031 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx7rp\" (UniqueName: \"kubernetes.io/projected/70e886bc-d0fc-4024-a751-3bc1e4233c3d-kube-api-access-kx7rp\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:17 crc kubenswrapper[4711]: I1203 12:50:17.336100 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70e886bc-d0fc-4024-a751-3bc1e4233c3d-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:17 crc kubenswrapper[4711]: I1203 12:50:17.336131 4711 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/70e886bc-d0fc-4024-a751-3bc1e4233c3d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:17 crc kubenswrapper[4711]: I1203 12:50:17.843104 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-g5vp7" event={"ID":"70e886bc-d0fc-4024-a751-3bc1e4233c3d","Type":"ContainerDied","Data":"f73f639bbd8405b9eecdc25c55d21923541e7daff0db6b27ad4711f0c68b2768"} Dec 03 12:50:17 crc kubenswrapper[4711]: I1203 12:50:17.843148 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f73f639bbd8405b9eecdc25c55d21923541e7daff0db6b27ad4711f0c68b2768" Dec 03 12:50:17 crc kubenswrapper[4711]: I1203 12:50:17.843201 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-g5vp7" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.371303 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 03 12:50:18 crc kubenswrapper[4711]: E1203 12:50:18.371569 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70e886bc-d0fc-4024-a751-3bc1e4233c3d" containerName="glance-db-sync" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.371581 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="70e886bc-d0fc-4024-a751-3bc1e4233c3d" containerName="glance-db-sync" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.371702 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="70e886bc-d0fc-4024-a751-3bc1e4233c3d" containerName="glance-db-sync" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.372385 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.374568 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.374769 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-8dvvn" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.378804 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.392128 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.556004 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef3c395a-16fc-46eb-9a8a-6885e4854610-scripts\") pod 
\"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.556054 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.556071 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.556088 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-run\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.556107 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-dev\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.556125 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef3c395a-16fc-46eb-9a8a-6885e4854610-httpd-run\") pod \"glance-default-single-0\" (UID: 
\"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.556144 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvdmk\" (UniqueName: \"kubernetes.io/projected/ef3c395a-16fc-46eb-9a8a-6885e4854610-kube-api-access-zvdmk\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.556175 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.556189 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-lib-modules\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.556311 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef3c395a-16fc-46eb-9a8a-6885e4854610-config-data\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.556421 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-etc-nvme\") pod 
\"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.556507 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-sys\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.556553 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.556578 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef3c395a-16fc-46eb-9a8a-6885e4854610-logs\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.658358 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.658703 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-lib-modules\") pod \"glance-default-single-0\" (UID: 
\"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.658740 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef3c395a-16fc-46eb-9a8a-6885e4854610-config-data\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.658498 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.658780 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-etc-nvme\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.658840 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-lib-modules\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.658849 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-sys\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: 
I1203 12:50:18.658873 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-sys\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.658919 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.658963 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.658978 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef3c395a-16fc-46eb-9a8a-6885e4854610-logs\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.659047 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef3c395a-16fc-46eb-9a8a-6885e4854610-scripts\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.659089 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.659111 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.659136 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-run\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.659190 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-dev\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.659221 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef3c395a-16fc-46eb-9a8a-6885e4854610-httpd-run\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.659246 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvdmk\" (UniqueName: \"kubernetes.io/projected/ef3c395a-16fc-46eb-9a8a-6885e4854610-kube-api-access-zvdmk\") pod \"glance-default-single-0\" (UID: 
\"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.659302 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-run\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.659435 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.659452 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") device mount path \"/mnt/openstack/pv15\"" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.659550 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef3c395a-16fc-46eb-9a8a-6885e4854610-logs\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.659596 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-etc-nvme\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 
12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.659683 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef3c395a-16fc-46eb-9a8a-6885e4854610-httpd-run\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.659760 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-dev\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.665893 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef3c395a-16fc-46eb-9a8a-6885e4854610-config-data\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.667458 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef3c395a-16fc-46eb-9a8a-6885e4854610-scripts\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.681540 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvdmk\" (UniqueName: \"kubernetes.io/projected/ef3c395a-16fc-46eb-9a8a-6885e4854610-kube-api-access-zvdmk\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.682555 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.683248 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-0\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:18 crc kubenswrapper[4711]: I1203 12:50:18.686595 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:19 crc kubenswrapper[4711]: I1203 12:50:19.046311 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 03 12:50:19 crc kubenswrapper[4711]: I1203 12:50:19.117760 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 03 12:50:19 crc kubenswrapper[4711]: I1203 12:50:19.859404 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"ef3c395a-16fc-46eb-9a8a-6885e4854610","Type":"ContainerStarted","Data":"5f98bb814084f9ecbe2655e485964c642a49bebdcb44cb7d19052b8890248c86"} Dec 03 12:50:19 crc kubenswrapper[4711]: I1203 12:50:19.859780 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"ef3c395a-16fc-46eb-9a8a-6885e4854610","Type":"ContainerStarted","Data":"777e1c37b63884dafc0965cbecdc2ae6c34bd2db09775325f75c23137665fcfe"} Dec 03 12:50:19 crc kubenswrapper[4711]: I1203 12:50:19.859792 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" 
event={"ID":"ef3c395a-16fc-46eb-9a8a-6885e4854610","Type":"ContainerStarted","Data":"cdc9b94b05dc3458d80fc85b81c98b63db5a63e97527769cdbce12f0dd33251c"} Dec 03 12:50:19 crc kubenswrapper[4711]: I1203 12:50:19.859632 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="ef3c395a-16fc-46eb-9a8a-6885e4854610" containerName="glance-httpd" containerID="cri-o://5f98bb814084f9ecbe2655e485964c642a49bebdcb44cb7d19052b8890248c86" gracePeriod=30 Dec 03 12:50:19 crc kubenswrapper[4711]: I1203 12:50:19.859590 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="ef3c395a-16fc-46eb-9a8a-6885e4854610" containerName="glance-log" containerID="cri-o://777e1c37b63884dafc0965cbecdc2ae6c34bd2db09775325f75c23137665fcfe" gracePeriod=30 Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.287600 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.384299 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"ef3c395a-16fc-46eb-9a8a-6885e4854610\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.384939 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-etc-nvme\") pod \"ef3c395a-16fc-46eb-9a8a-6885e4854610\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.385040 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod 
"ef3c395a-16fc-46eb-9a8a-6885e4854610" (UID: "ef3c395a-16fc-46eb-9a8a-6885e4854610"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.385160 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ef3c395a-16fc-46eb-9a8a-6885e4854610\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.385272 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-sys\") pod \"ef3c395a-16fc-46eb-9a8a-6885e4854610\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.385381 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef3c395a-16fc-46eb-9a8a-6885e4854610-config-data\") pod \"ef3c395a-16fc-46eb-9a8a-6885e4854610\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.385497 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-etc-iscsi\") pod \"ef3c395a-16fc-46eb-9a8a-6885e4854610\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.385615 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-lib-modules\") pod \"ef3c395a-16fc-46eb-9a8a-6885e4854610\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.385710 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef3c395a-16fc-46eb-9a8a-6885e4854610-scripts\") pod \"ef3c395a-16fc-46eb-9a8a-6885e4854610\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.385827 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef3c395a-16fc-46eb-9a8a-6885e4854610-httpd-run\") pod \"ef3c395a-16fc-46eb-9a8a-6885e4854610\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.385975 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvdmk\" (UniqueName: \"kubernetes.io/projected/ef3c395a-16fc-46eb-9a8a-6885e4854610-kube-api-access-zvdmk\") pod \"ef3c395a-16fc-46eb-9a8a-6885e4854610\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.386088 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef3c395a-16fc-46eb-9a8a-6885e4854610-logs\") pod \"ef3c395a-16fc-46eb-9a8a-6885e4854610\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.385318 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-sys" (OuterVolumeSpecName: "sys") pod "ef3c395a-16fc-46eb-9a8a-6885e4854610" (UID: "ef3c395a-16fc-46eb-9a8a-6885e4854610"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.385581 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "ef3c395a-16fc-46eb-9a8a-6885e4854610" (UID: "ef3c395a-16fc-46eb-9a8a-6885e4854610"). 
InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.385688 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "ef3c395a-16fc-46eb-9a8a-6885e4854610" (UID: "ef3c395a-16fc-46eb-9a8a-6885e4854610"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.386354 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-run\") pod \"ef3c395a-16fc-46eb-9a8a-6885e4854610\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.386472 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-dev\") pod \"ef3c395a-16fc-46eb-9a8a-6885e4854610\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.386565 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-var-locks-brick\") pod \"ef3c395a-16fc-46eb-9a8a-6885e4854610\" (UID: \"ef3c395a-16fc-46eb-9a8a-6885e4854610\") " Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.386150 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef3c395a-16fc-46eb-9a8a-6885e4854610-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ef3c395a-16fc-46eb-9a8a-6885e4854610" (UID: "ef3c395a-16fc-46eb-9a8a-6885e4854610"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.386593 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-run" (OuterVolumeSpecName: "run") pod "ef3c395a-16fc-46eb-9a8a-6885e4854610" (UID: "ef3c395a-16fc-46eb-9a8a-6885e4854610"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.386593 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-dev" (OuterVolumeSpecName: "dev") pod "ef3c395a-16fc-46eb-9a8a-6885e4854610" (UID: "ef3c395a-16fc-46eb-9a8a-6885e4854610"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.386640 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "ef3c395a-16fc-46eb-9a8a-6885e4854610" (UID: "ef3c395a-16fc-46eb-9a8a-6885e4854610"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.386794 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef3c395a-16fc-46eb-9a8a-6885e4854610-logs" (OuterVolumeSpecName: "logs") pod "ef3c395a-16fc-46eb-9a8a-6885e4854610" (UID: "ef3c395a-16fc-46eb-9a8a-6885e4854610"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.387250 4711 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.387337 4711 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-sys\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.387411 4711 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.387513 4711 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.387590 4711 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef3c395a-16fc-46eb-9a8a-6885e4854610-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.387669 4711 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef3c395a-16fc-46eb-9a8a-6885e4854610-logs\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.387740 4711 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.387820 4711 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-dev\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.387893 4711 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ef3c395a-16fc-46eb-9a8a-6885e4854610-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.389480 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage15-crc" (OuterVolumeSpecName: "glance-cache") pod "ef3c395a-16fc-46eb-9a8a-6885e4854610" (UID: "ef3c395a-16fc-46eb-9a8a-6885e4854610"). InnerVolumeSpecName "local-storage15-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.389547 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef3c395a-16fc-46eb-9a8a-6885e4854610-kube-api-access-zvdmk" (OuterVolumeSpecName: "kube-api-access-zvdmk") pod "ef3c395a-16fc-46eb-9a8a-6885e4854610" (UID: "ef3c395a-16fc-46eb-9a8a-6885e4854610"). InnerVolumeSpecName "kube-api-access-zvdmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.389691 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef3c395a-16fc-46eb-9a8a-6885e4854610-scripts" (OuterVolumeSpecName: "scripts") pod "ef3c395a-16fc-46eb-9a8a-6885e4854610" (UID: "ef3c395a-16fc-46eb-9a8a-6885e4854610"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.390645 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "ef3c395a-16fc-46eb-9a8a-6885e4854610" (UID: "ef3c395a-16fc-46eb-9a8a-6885e4854610"). 
InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.422311 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef3c395a-16fc-46eb-9a8a-6885e4854610-config-data" (OuterVolumeSpecName: "config-data") pod "ef3c395a-16fc-46eb-9a8a-6885e4854610" (UID: "ef3c395a-16fc-46eb-9a8a-6885e4854610"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.489033 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" " Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.489071 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.489085 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef3c395a-16fc-46eb-9a8a-6885e4854610-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.489097 4711 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef3c395a-16fc-46eb-9a8a-6885e4854610-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.489110 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvdmk\" (UniqueName: \"kubernetes.io/projected/ef3c395a-16fc-46eb-9a8a-6885e4854610-kube-api-access-zvdmk\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.502223 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage15-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage15-crc") on node "crc" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.505099 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.591099 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.591441 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.871798 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.871808 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"ef3c395a-16fc-46eb-9a8a-6885e4854610","Type":"ContainerDied","Data":"5f98bb814084f9ecbe2655e485964c642a49bebdcb44cb7d19052b8890248c86"} Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.871821 4711 generic.go:334] "Generic (PLEG): container finished" podID="ef3c395a-16fc-46eb-9a8a-6885e4854610" containerID="5f98bb814084f9ecbe2655e485964c642a49bebdcb44cb7d19052b8890248c86" exitCode=143 Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.872310 4711 scope.go:117] "RemoveContainer" containerID="5f98bb814084f9ecbe2655e485964c642a49bebdcb44cb7d19052b8890248c86" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.872333 4711 generic.go:334] "Generic (PLEG): container finished" podID="ef3c395a-16fc-46eb-9a8a-6885e4854610" containerID="777e1c37b63884dafc0965cbecdc2ae6c34bd2db09775325f75c23137665fcfe" exitCode=143 Dec 03 
12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.872364 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"ef3c395a-16fc-46eb-9a8a-6885e4854610","Type":"ContainerDied","Data":"777e1c37b63884dafc0965cbecdc2ae6c34bd2db09775325f75c23137665fcfe"} Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.872388 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"ef3c395a-16fc-46eb-9a8a-6885e4854610","Type":"ContainerDied","Data":"cdc9b94b05dc3458d80fc85b81c98b63db5a63e97527769cdbce12f0dd33251c"} Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.909507 4711 scope.go:117] "RemoveContainer" containerID="777e1c37b63884dafc0965cbecdc2ae6c34bd2db09775325f75c23137665fcfe" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.914699 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.920202 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.938097 4711 scope.go:117] "RemoveContainer" containerID="5f98bb814084f9ecbe2655e485964c642a49bebdcb44cb7d19052b8890248c86" Dec 03 12:50:20 crc kubenswrapper[4711]: E1203 12:50:20.938605 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f98bb814084f9ecbe2655e485964c642a49bebdcb44cb7d19052b8890248c86\": container with ID starting with 5f98bb814084f9ecbe2655e485964c642a49bebdcb44cb7d19052b8890248c86 not found: ID does not exist" containerID="5f98bb814084f9ecbe2655e485964c642a49bebdcb44cb7d19052b8890248c86" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.938651 4711 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5f98bb814084f9ecbe2655e485964c642a49bebdcb44cb7d19052b8890248c86"} err="failed to get container status \"5f98bb814084f9ecbe2655e485964c642a49bebdcb44cb7d19052b8890248c86\": rpc error: code = NotFound desc = could not find container \"5f98bb814084f9ecbe2655e485964c642a49bebdcb44cb7d19052b8890248c86\": container with ID starting with 5f98bb814084f9ecbe2655e485964c642a49bebdcb44cb7d19052b8890248c86 not found: ID does not exist" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.938679 4711 scope.go:117] "RemoveContainer" containerID="777e1c37b63884dafc0965cbecdc2ae6c34bd2db09775325f75c23137665fcfe" Dec 03 12:50:20 crc kubenswrapper[4711]: E1203 12:50:20.938953 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"777e1c37b63884dafc0965cbecdc2ae6c34bd2db09775325f75c23137665fcfe\": container with ID starting with 777e1c37b63884dafc0965cbecdc2ae6c34bd2db09775325f75c23137665fcfe not found: ID does not exist" containerID="777e1c37b63884dafc0965cbecdc2ae6c34bd2db09775325f75c23137665fcfe" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.938978 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"777e1c37b63884dafc0965cbecdc2ae6c34bd2db09775325f75c23137665fcfe"} err="failed to get container status \"777e1c37b63884dafc0965cbecdc2ae6c34bd2db09775325f75c23137665fcfe\": rpc error: code = NotFound desc = could not find container \"777e1c37b63884dafc0965cbecdc2ae6c34bd2db09775325f75c23137665fcfe\": container with ID starting with 777e1c37b63884dafc0965cbecdc2ae6c34bd2db09775325f75c23137665fcfe not found: ID does not exist" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.938992 4711 scope.go:117] "RemoveContainer" containerID="5f98bb814084f9ecbe2655e485964c642a49bebdcb44cb7d19052b8890248c86" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.939159 4711 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"5f98bb814084f9ecbe2655e485964c642a49bebdcb44cb7d19052b8890248c86"} err="failed to get container status \"5f98bb814084f9ecbe2655e485964c642a49bebdcb44cb7d19052b8890248c86\": rpc error: code = NotFound desc = could not find container \"5f98bb814084f9ecbe2655e485964c642a49bebdcb44cb7d19052b8890248c86\": container with ID starting with 5f98bb814084f9ecbe2655e485964c642a49bebdcb44cb7d19052b8890248c86 not found: ID does not exist" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.939182 4711 scope.go:117] "RemoveContainer" containerID="777e1c37b63884dafc0965cbecdc2ae6c34bd2db09775325f75c23137665fcfe" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.939346 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"777e1c37b63884dafc0965cbecdc2ae6c34bd2db09775325f75c23137665fcfe"} err="failed to get container status \"777e1c37b63884dafc0965cbecdc2ae6c34bd2db09775325f75c23137665fcfe\": rpc error: code = NotFound desc = could not find container \"777e1c37b63884dafc0965cbecdc2ae6c34bd2db09775325f75c23137665fcfe\": container with ID starting with 777e1c37b63884dafc0965cbecdc2ae6c34bd2db09775325f75c23137665fcfe not found: ID does not exist" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.941506 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 03 12:50:20 crc kubenswrapper[4711]: E1203 12:50:20.942217 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef3c395a-16fc-46eb-9a8a-6885e4854610" containerName="glance-httpd" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.942237 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef3c395a-16fc-46eb-9a8a-6885e4854610" containerName="glance-httpd" Dec 03 12:50:20 crc kubenswrapper[4711]: E1203 12:50:20.942261 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef3c395a-16fc-46eb-9a8a-6885e4854610" 
containerName="glance-log" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.942268 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef3c395a-16fc-46eb-9a8a-6885e4854610" containerName="glance-log" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.942396 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef3c395a-16fc-46eb-9a8a-6885e4854610" containerName="glance-httpd" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.942415 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef3c395a-16fc-46eb-9a8a-6885e4854610" containerName="glance-log" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.943215 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.945901 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.946509 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.951332 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 03 12:50:20 crc kubenswrapper[4711]: I1203 12:50:20.953312 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-8dvvn" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.097973 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-dev\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.098217 4711 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d62c935e-dea8-493a-a087-7cbf5c2a9542-logs\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.098325 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-run\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.098438 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdxvp\" (UniqueName: \"kubernetes.io/projected/d62c935e-dea8-493a-a087-7cbf5c2a9542-kube-api-access-cdxvp\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.098537 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d62c935e-dea8-493a-a087-7cbf5c2a9542-config-data\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.098651 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.098751 4711 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d62c935e-dea8-493a-a087-7cbf5c2a9542-httpd-run\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.098842 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d62c935e-dea8-493a-a087-7cbf5c2a9542-scripts\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.098944 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.099024 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-sys\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.099153 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-lib-modules\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.099256 4711 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-etc-nvme\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.099345 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.099422 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.201102 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdxvp\" (UniqueName: \"kubernetes.io/projected/d62c935e-dea8-493a-a087-7cbf5c2a9542-kube-api-access-cdxvp\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.201185 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d62c935e-dea8-493a-a087-7cbf5c2a9542-config-data\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.201215 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.201251 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d62c935e-dea8-493a-a087-7cbf5c2a9542-httpd-run\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.201280 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d62c935e-dea8-493a-a087-7cbf5c2a9542-scripts\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.201308 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.201331 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-sys\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.201380 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-lib-modules\") pod 
\"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.201410 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.201430 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.201451 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-etc-nvme\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.201490 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-dev\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.201512 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d62c935e-dea8-493a-a087-7cbf5c2a9542-logs\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc 
kubenswrapper[4711]: I1203 12:50:21.201525 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.201586 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-run\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.201668 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-run\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.201702 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-sys\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.201729 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-lib-modules\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.201947 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.202102 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-etc-nvme\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.202105 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-dev\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.202206 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") device mount path \"/mnt/openstack/pv15\"" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.202224 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.202541 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d62c935e-dea8-493a-a087-7cbf5c2a9542-httpd-run\") pod 
\"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.202705 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d62c935e-dea8-493a-a087-7cbf5c2a9542-logs\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.207402 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d62c935e-dea8-493a-a087-7cbf5c2a9542-scripts\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.208656 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d62c935e-dea8-493a-a087-7cbf5c2a9542-config-data\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.232078 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdxvp\" (UniqueName: \"kubernetes.io/projected/d62c935e-dea8-493a-a087-7cbf5c2a9542-kube-api-access-cdxvp\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.238418 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc 
kubenswrapper[4711]: I1203 12:50:21.243574 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-single-0\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.264433 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.708779 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.826089 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef3c395a-16fc-46eb-9a8a-6885e4854610" path="/var/lib/kubelet/pods/ef3c395a-16fc-46eb-9a8a-6885e4854610/volumes" Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.883235 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"d62c935e-dea8-493a-a087-7cbf5c2a9542","Type":"ContainerStarted","Data":"56c3fc50fd89b132f87c1a7f6058c3d4785f1e2bf13d99de4db26f7bf1d030e3"} Dec 03 12:50:21 crc kubenswrapper[4711]: I1203 12:50:21.883569 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"d62c935e-dea8-493a-a087-7cbf5c2a9542","Type":"ContainerStarted","Data":"00860316c30ef0e078f2bd6c4260646b36a3480130cdf53a5d75d8a90f5ef778"} Dec 03 12:50:22 crc kubenswrapper[4711]: I1203 12:50:22.891499 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"d62c935e-dea8-493a-a087-7cbf5c2a9542","Type":"ContainerStarted","Data":"ce31d16d12772ab6e0cb70ec5d1ceaa19ffa130fadd855b4e7f4cb6154c2d5dd"} Dec 03 12:50:22 crc kubenswrapper[4711]: I1203 12:50:22.922041 4711 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=2.922019782 podStartE2EDuration="2.922019782s" podCreationTimestamp="2025-12-03 12:50:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:50:22.918122936 +0000 UTC m=+2141.587374211" watchObservedRunningTime="2025-12-03 12:50:22.922019782 +0000 UTC m=+2141.591271037" Dec 03 12:50:31 crc kubenswrapper[4711]: I1203 12:50:31.264827 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:31 crc kubenswrapper[4711]: I1203 12:50:31.265339 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:31 crc kubenswrapper[4711]: I1203 12:50:31.294740 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:31 crc kubenswrapper[4711]: I1203 12:50:31.328458 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:31 crc kubenswrapper[4711]: I1203 12:50:31.972077 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:31 crc kubenswrapper[4711]: I1203 12:50:31.972124 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:33 crc kubenswrapper[4711]: I1203 12:50:33.902506 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:33 crc kubenswrapper[4711]: I1203 12:50:33.984738 4711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 12:50:34 crc kubenswrapper[4711]: I1203 
12:50:34.025030 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.045526 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.046888 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.053370 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.054823 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.063340 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.080217 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.235076 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-config-data\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.235396 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc 
kubenswrapper[4711]: I1203 12:50:36.235487 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-etc-nvme\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.235566 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a68f942-f23a-4131-b62a-70f726cee074-config-data\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.235652 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a68f942-f23a-4131-b62a-70f726cee074-logs\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.235749 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-sys\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.235836 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.235947 4711 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-scripts\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.236029 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-run\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.236111 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spj4p\" (UniqueName: \"kubernetes.io/projected/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-kube-api-access-spj4p\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.236204 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-etc-nvme\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.236230 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a68f942-f23a-4131-b62a-70f726cee074-scripts\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.236255 4711 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a68f942-f23a-4131-b62a-70f726cee074-httpd-run\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.236273 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-dev\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.236292 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-var-locks-brick\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.236322 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-httpd-run\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.236343 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-run\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.236367 4711 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.236413 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.236432 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqkwx\" (UniqueName: \"kubernetes.io/projected/7a68f942-f23a-4131-b62a-70f726cee074-kube-api-access-bqkwx\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.236453 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.236471 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-lib-modules\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.236487 4711 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-logs\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.236508 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-dev\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.236525 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.236538 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-sys\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.236575 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-etc-iscsi\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.236599 4711 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-lib-modules\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.338468 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-lib-modules\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.338586 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-lib-modules\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.338839 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-config-data\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.338972 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.339007 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-etc-nvme\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.339050 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a68f942-f23a-4131-b62a-70f726cee074-config-data\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.339105 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a68f942-f23a-4131-b62a-70f726cee074-logs\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.339132 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-etc-nvme\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.339405 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-sys\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.339439 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") 
" pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.339461 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-scripts\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.339484 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-run\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.339505 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-sys\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.339508 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spj4p\" (UniqueName: \"kubernetes.io/projected/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-kube-api-access-spj4p\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.339627 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-etc-nvme\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.339649 4711 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a68f942-f23a-4131-b62a-70f726cee074-scripts\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.339696 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a68f942-f23a-4131-b62a-70f726cee074-httpd-run\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.339712 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.339793 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-etc-nvme\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.339728 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-dev\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.339878 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7a68f942-f23a-4131-b62a-70f726cee074-logs\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.340024 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") device mount path \"/mnt/openstack/pv09\"" pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.340058 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-run\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.340323 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a68f942-f23a-4131-b62a-70f726cee074-httpd-run\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.340376 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-var-locks-brick\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.340425 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-httpd-run\") pod 
\"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.340466 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-run\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.340503 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.340510 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-var-locks-brick\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.340528 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqkwx\" (UniqueName: \"kubernetes.io/projected/7a68f942-f23a-4131-b62a-70f726cee074-kube-api-access-bqkwx\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.340744 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-run\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " 
pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.340781 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.340823 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.340854 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-lib-modules\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.340882 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-logs\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.340938 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-dev\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.340968 4711 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.340992 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-sys\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.341028 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-etc-iscsi\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.341141 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-etc-iscsi\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.341181 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.341206 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-dev\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.341219 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-httpd-run\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.341286 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") device mount path \"/mnt/openstack/pv02\"" pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.341572 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-logs\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.341624 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.341664 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-lib-modules\") pod \"glance-default-single-1\" (UID: 
\"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.341788 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") device mount path \"/mnt/openstack/pv14\"" pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.343173 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-dev\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.343377 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-sys\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.345891 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-scripts\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.347538 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a68f942-f23a-4131-b62a-70f726cee074-scripts\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 
12:50:36.349071 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a68f942-f23a-4131-b62a-70f726cee074-config-data\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.362146 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-config-data\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.370432 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqkwx\" (UniqueName: \"kubernetes.io/projected/7a68f942-f23a-4131-b62a-70f726cee074-kube-api-access-bqkwx\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.379076 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.380152 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spj4p\" (UniqueName: \"kubernetes.io/projected/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-kube-api-access-spj4p\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.389842 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.392548 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-2\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.396584 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-1\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.667586 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.675949 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:36 crc kubenswrapper[4711]: I1203 12:50:36.963664 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Dec 03 12:50:37 crc kubenswrapper[4711]: I1203 12:50:37.021776 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"e6bd59da-4fe4-4f9d-9b5c-c14066faba60","Type":"ContainerStarted","Data":"9042e53619ba159644aa4c995a8ec3c0a1801aeeeeb6b398c07324b5e90d333a"} Dec 03 12:50:37 crc kubenswrapper[4711]: I1203 12:50:37.219198 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Dec 03 12:50:38 crc kubenswrapper[4711]: I1203 12:50:38.032262 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"e6bd59da-4fe4-4f9d-9b5c-c14066faba60","Type":"ContainerStarted","Data":"4a03670217764c13d48577cae5121a1b9a3e885443a6c274f7dd0f99642f5736"} Dec 03 12:50:38 crc kubenswrapper[4711]: I1203 12:50:38.033014 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"e6bd59da-4fe4-4f9d-9b5c-c14066faba60","Type":"ContainerStarted","Data":"09a6c83fabe159d844e638112e6b5427af45fb0c064d27276262e1b84800900f"} Dec 03 12:50:38 crc kubenswrapper[4711]: I1203 12:50:38.034264 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"7a68f942-f23a-4131-b62a-70f726cee074","Type":"ContainerStarted","Data":"4e18118da860925154891801320103abd44df5f5a393d765d10ea07b3993a987"} Dec 03 12:50:38 crc kubenswrapper[4711]: I1203 12:50:38.034302 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" 
event={"ID":"7a68f942-f23a-4131-b62a-70f726cee074","Type":"ContainerStarted","Data":"9765e1f07a388e4dae1d48ecb1e6a556082806ba40df75a23abb570746c7aaee"} Dec 03 12:50:38 crc kubenswrapper[4711]: I1203 12:50:38.034315 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"7a68f942-f23a-4131-b62a-70f726cee074","Type":"ContainerStarted","Data":"8c80902e72823f22021bb68a114e4497eac8d1fc8b9f1c69c2d6cc801cf6816e"} Dec 03 12:50:38 crc kubenswrapper[4711]: I1203 12:50:38.058736 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-2" podStartSLOduration=3.058709919 podStartE2EDuration="3.058709919s" podCreationTimestamp="2025-12-03 12:50:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:50:38.051492453 +0000 UTC m=+2156.720743728" watchObservedRunningTime="2025-12-03 12:50:38.058709919 +0000 UTC m=+2156.727961174" Dec 03 12:50:38 crc kubenswrapper[4711]: I1203 12:50:38.074587 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-1" podStartSLOduration=3.074570843 podStartE2EDuration="3.074570843s" podCreationTimestamp="2025-12-03 12:50:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:50:38.073197236 +0000 UTC m=+2156.742448491" watchObservedRunningTime="2025-12-03 12:50:38.074570843 +0000 UTC m=+2156.743822098" Dec 03 12:50:46 crc kubenswrapper[4711]: I1203 12:50:46.668832 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:46 crc kubenswrapper[4711]: I1203 12:50:46.669417 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-2" Dec 
03 12:50:46 crc kubenswrapper[4711]: I1203 12:50:46.676918 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:46 crc kubenswrapper[4711]: I1203 12:50:46.676959 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:46 crc kubenswrapper[4711]: I1203 12:50:46.708288 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:46 crc kubenswrapper[4711]: I1203 12:50:46.731066 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:46 crc kubenswrapper[4711]: I1203 12:50:46.736097 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:46 crc kubenswrapper[4711]: I1203 12:50:46.746807 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:47 crc kubenswrapper[4711]: I1203 12:50:47.117638 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:47 crc kubenswrapper[4711]: I1203 12:50:47.118360 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:47 crc kubenswrapper[4711]: I1203 12:50:47.118537 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:47 crc kubenswrapper[4711]: I1203 12:50:47.118691 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:49 crc kubenswrapper[4711]: I1203 12:50:49.002781 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:49 crc kubenswrapper[4711]: I1203 12:50:49.036121 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:49 crc kubenswrapper[4711]: I1203 12:50:49.036717 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:49 crc kubenswrapper[4711]: I1203 12:50:49.110083 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:49 crc kubenswrapper[4711]: I1203 12:50:49.788297 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Dec 03 12:50:49 crc kubenswrapper[4711]: I1203 12:50:49.803094 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Dec 03 12:50:51 crc kubenswrapper[4711]: I1203 12:50:51.150648 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-2" podUID="e6bd59da-4fe4-4f9d-9b5c-c14066faba60" containerName="glance-log" containerID="cri-o://09a6c83fabe159d844e638112e6b5427af45fb0c064d27276262e1b84800900f" gracePeriod=30 Dec 03 12:50:51 crc kubenswrapper[4711]: I1203 12:50:51.152229 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="7a68f942-f23a-4131-b62a-70f726cee074" containerName="glance-log" containerID="cri-o://9765e1f07a388e4dae1d48ecb1e6a556082806ba40df75a23abb570746c7aaee" gracePeriod=30 Dec 03 12:50:51 crc kubenswrapper[4711]: I1203 12:50:51.152283 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-2" podUID="e6bd59da-4fe4-4f9d-9b5c-c14066faba60" containerName="glance-httpd" 
containerID="cri-o://4a03670217764c13d48577cae5121a1b9a3e885443a6c274f7dd0f99642f5736" gracePeriod=30 Dec 03 12:50:51 crc kubenswrapper[4711]: I1203 12:50:51.152458 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="7a68f942-f23a-4131-b62a-70f726cee074" containerName="glance-httpd" containerID="cri-o://4e18118da860925154891801320103abd44df5f5a393d765d10ea07b3993a987" gracePeriod=30 Dec 03 12:50:52 crc kubenswrapper[4711]: I1203 12:50:52.162153 4711 generic.go:334] "Generic (PLEG): container finished" podID="7a68f942-f23a-4131-b62a-70f726cee074" containerID="9765e1f07a388e4dae1d48ecb1e6a556082806ba40df75a23abb570746c7aaee" exitCode=143 Dec 03 12:50:52 crc kubenswrapper[4711]: I1203 12:50:52.162254 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"7a68f942-f23a-4131-b62a-70f726cee074","Type":"ContainerDied","Data":"9765e1f07a388e4dae1d48ecb1e6a556082806ba40df75a23abb570746c7aaee"} Dec 03 12:50:52 crc kubenswrapper[4711]: I1203 12:50:52.165268 4711 generic.go:334] "Generic (PLEG): container finished" podID="e6bd59da-4fe4-4f9d-9b5c-c14066faba60" containerID="09a6c83fabe159d844e638112e6b5427af45fb0c064d27276262e1b84800900f" exitCode=143 Dec 03 12:50:52 crc kubenswrapper[4711]: I1203 12:50:52.165303 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"e6bd59da-4fe4-4f9d-9b5c-c14066faba60","Type":"ContainerDied","Data":"09a6c83fabe159d844e638112e6b5427af45fb0c064d27276262e1b84800900f"} Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.618393 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.725091 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.739804 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-lib-modules\") pod \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.739873 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-etc-nvme\") pod \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.739964 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-dev\") pod \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.740046 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.740119 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spj4p\" (UniqueName: \"kubernetes.io/projected/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-kube-api-access-spj4p\") pod \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.740152 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-etc-iscsi\") pod \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.740218 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-httpd-run\") pod \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.740277 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-scripts\") pod \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.740307 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-var-locks-brick\") pod \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.740329 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.740374 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-sys\") pod \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.740393 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-run\") pod \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.740537 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-logs\") pod \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.740577 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-config-data\") pod \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\" (UID: \"e6bd59da-4fe4-4f9d-9b5c-c14066faba60\") " Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.742177 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "e6bd59da-4fe4-4f9d-9b5c-c14066faba60" (UID: "e6bd59da-4fe4-4f9d-9b5c-c14066faba60"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.742194 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-dev" (OuterVolumeSpecName: "dev") pod "e6bd59da-4fe4-4f9d-9b5c-c14066faba60" (UID: "e6bd59da-4fe4-4f9d-9b5c-c14066faba60"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.742239 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "e6bd59da-4fe4-4f9d-9b5c-c14066faba60" (UID: "e6bd59da-4fe4-4f9d-9b5c-c14066faba60"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.742429 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e6bd59da-4fe4-4f9d-9b5c-c14066faba60" (UID: "e6bd59da-4fe4-4f9d-9b5c-c14066faba60"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.742507 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "e6bd59da-4fe4-4f9d-9b5c-c14066faba60" (UID: "e6bd59da-4fe4-4f9d-9b5c-c14066faba60"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.743605 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-sys" (OuterVolumeSpecName: "sys") pod "e6bd59da-4fe4-4f9d-9b5c-c14066faba60" (UID: "e6bd59da-4fe4-4f9d-9b5c-c14066faba60"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.743623 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-run" (OuterVolumeSpecName: "run") pod "e6bd59da-4fe4-4f9d-9b5c-c14066faba60" (UID: "e6bd59da-4fe4-4f9d-9b5c-c14066faba60"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.743615 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "e6bd59da-4fe4-4f9d-9b5c-c14066faba60" (UID: "e6bd59da-4fe4-4f9d-9b5c-c14066faba60"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.743830 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-logs" (OuterVolumeSpecName: "logs") pod "e6bd59da-4fe4-4f9d-9b5c-c14066faba60" (UID: "e6bd59da-4fe4-4f9d-9b5c-c14066faba60"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.755737 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-kube-api-access-spj4p" (OuterVolumeSpecName: "kube-api-access-spj4p") pod "e6bd59da-4fe4-4f9d-9b5c-c14066faba60" (UID: "e6bd59da-4fe4-4f9d-9b5c-c14066faba60"). InnerVolumeSpecName "kube-api-access-spj4p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.756065 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "e6bd59da-4fe4-4f9d-9b5c-c14066faba60" (UID: "e6bd59da-4fe4-4f9d-9b5c-c14066faba60"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.758807 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance-cache") pod "e6bd59da-4fe4-4f9d-9b5c-c14066faba60" (UID: "e6bd59da-4fe4-4f9d-9b5c-c14066faba60"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.759424 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-scripts" (OuterVolumeSpecName: "scripts") pod "e6bd59da-4fe4-4f9d-9b5c-c14066faba60" (UID: "e6bd59da-4fe4-4f9d-9b5c-c14066faba60"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.786763 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-config-data" (OuterVolumeSpecName: "config-data") pod "e6bd59da-4fe4-4f9d-9b5c-c14066faba60" (UID: "e6bd59da-4fe4-4f9d-9b5c-c14066faba60"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.841664 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a68f942-f23a-4131-b62a-70f726cee074-logs\") pod \"7a68f942-f23a-4131-b62a-70f726cee074\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.842026 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-var-locks-brick\") pod \"7a68f942-f23a-4131-b62a-70f726cee074\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.842072 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-etc-nvme\") pod \"7a68f942-f23a-4131-b62a-70f726cee074\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.842098 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-etc-iscsi\") pod \"7a68f942-f23a-4131-b62a-70f726cee074\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.842139 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-sys\") pod \"7a68f942-f23a-4131-b62a-70f726cee074\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.842136 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-var-locks-brick" 
(OuterVolumeSpecName: "var-locks-brick") pod "7a68f942-f23a-4131-b62a-70f726cee074" (UID: "7a68f942-f23a-4131-b62a-70f726cee074"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.842165 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a68f942-f23a-4131-b62a-70f726cee074-scripts\") pod \"7a68f942-f23a-4131-b62a-70f726cee074\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.842186 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-run\") pod \"7a68f942-f23a-4131-b62a-70f726cee074\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.842194 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "7a68f942-f23a-4131-b62a-70f726cee074" (UID: "7a68f942-f23a-4131-b62a-70f726cee074"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.842215 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "7a68f942-f23a-4131-b62a-70f726cee074" (UID: "7a68f942-f23a-4131-b62a-70f726cee074"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.842228 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"7a68f942-f23a-4131-b62a-70f726cee074\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.842265 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqkwx\" (UniqueName: \"kubernetes.io/projected/7a68f942-f23a-4131-b62a-70f726cee074-kube-api-access-bqkwx\") pod \"7a68f942-f23a-4131-b62a-70f726cee074\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.842320 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"7a68f942-f23a-4131-b62a-70f726cee074\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.842338 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-dev\") pod \"7a68f942-f23a-4131-b62a-70f726cee074\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.842373 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a68f942-f23a-4131-b62a-70f726cee074-httpd-run\") pod \"7a68f942-f23a-4131-b62a-70f726cee074\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.842381 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a68f942-f23a-4131-b62a-70f726cee074-logs" (OuterVolumeSpecName: "logs") pod 
"7a68f942-f23a-4131-b62a-70f726cee074" (UID: "7a68f942-f23a-4131-b62a-70f726cee074"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.842389 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-lib-modules\") pod \"7a68f942-f23a-4131-b62a-70f726cee074\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.842436 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "7a68f942-f23a-4131-b62a-70f726cee074" (UID: "7a68f942-f23a-4131-b62a-70f726cee074"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.842469 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-run" (OuterVolumeSpecName: "run") pod "7a68f942-f23a-4131-b62a-70f726cee074" (UID: "7a68f942-f23a-4131-b62a-70f726cee074"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.842489 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-dev" (OuterVolumeSpecName: "dev") pod "7a68f942-f23a-4131-b62a-70f726cee074" (UID: "7a68f942-f23a-4131-b62a-70f726cee074"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.842505 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a68f942-f23a-4131-b62a-70f726cee074-config-data\") pod \"7a68f942-f23a-4131-b62a-70f726cee074\" (UID: \"7a68f942-f23a-4131-b62a-70f726cee074\") " Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.842692 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a68f942-f23a-4131-b62a-70f726cee074-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7a68f942-f23a-4131-b62a-70f726cee074" (UID: "7a68f942-f23a-4131-b62a-70f726cee074"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.843184 4711 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.843199 4711 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a68f942-f23a-4131-b62a-70f726cee074-logs\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.843207 4711 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.843217 4711 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.843226 4711 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" 
(UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.843245 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.843255 4711 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-sys\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.843263 4711 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.843272 4711 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.843280 4711 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.843288 4711 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.843295 4711 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-logs\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.843303 4711 reconciler_common.go:293] "Volume detached for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.843312 4711 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.843323 4711 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-dev\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.843345 4711 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.843353 4711 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-dev\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.843360 4711 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a68f942-f23a-4131-b62a-70f726cee074-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.843368 4711 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.843383 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.843391 4711 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spj4p\" (UniqueName: \"kubernetes.io/projected/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-kube-api-access-spj4p\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.843400 4711 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e6bd59da-4fe4-4f9d-9b5c-c14066faba60-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.845137 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-sys" (OuterVolumeSpecName: "sys") pod "7a68f942-f23a-4131-b62a-70f726cee074" (UID: "7a68f942-f23a-4131-b62a-70f726cee074"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.845476 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance-cache") pod "7a68f942-f23a-4131-b62a-70f726cee074" (UID: "7a68f942-f23a-4131-b62a-70f726cee074"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.846465 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "7a68f942-f23a-4131-b62a-70f726cee074" (UID: "7a68f942-f23a-4131-b62a-70f726cee074"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.846600 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a68f942-f23a-4131-b62a-70f726cee074-scripts" (OuterVolumeSpecName: "scripts") pod "7a68f942-f23a-4131-b62a-70f726cee074" (UID: "7a68f942-f23a-4131-b62a-70f726cee074"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.849874 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a68f942-f23a-4131-b62a-70f726cee074-kube-api-access-bqkwx" (OuterVolumeSpecName: "kube-api-access-bqkwx") pod "7a68f942-f23a-4131-b62a-70f726cee074" (UID: "7a68f942-f23a-4131-b62a-70f726cee074"). InnerVolumeSpecName "kube-api-access-bqkwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.858739 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.865399 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.892264 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a68f942-f23a-4131-b62a-70f726cee074-config-data" (OuterVolumeSpecName: "config-data") pod "7a68f942-f23a-4131-b62a-70f726cee074" (UID: "7a68f942-f23a-4131-b62a-70f726cee074"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.945481 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.945743 4711 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7a68f942-f23a-4131-b62a-70f726cee074-sys\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.945817 4711 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a68f942-f23a-4131-b62a-70f726cee074-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.945900 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.946010 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqkwx\" (UniqueName: \"kubernetes.io/projected/7a68f942-f23a-4131-b62a-70f726cee074-kube-api-access-bqkwx\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.946128 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.946200 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.946268 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7a68f942-f23a-4131-b62a-70f726cee074-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.973527 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 03 12:50:54 crc kubenswrapper[4711]: I1203 12:50:54.976743 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 03 12:50:55 crc kubenswrapper[4711]: I1203 12:50:55.047463 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:55 crc kubenswrapper[4711]: I1203 12:50:55.047500 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:55 crc kubenswrapper[4711]: I1203 12:50:55.197780 4711 generic.go:334] "Generic (PLEG): container finished" podID="e6bd59da-4fe4-4f9d-9b5c-c14066faba60" containerID="4a03670217764c13d48577cae5121a1b9a3e885443a6c274f7dd0f99642f5736" exitCode=0 Dec 03 12:50:55 crc kubenswrapper[4711]: I1203 12:50:55.197937 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"e6bd59da-4fe4-4f9d-9b5c-c14066faba60","Type":"ContainerDied","Data":"4a03670217764c13d48577cae5121a1b9a3e885443a6c274f7dd0f99642f5736"} Dec 03 12:50:55 crc kubenswrapper[4711]: I1203 12:50:55.197975 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"e6bd59da-4fe4-4f9d-9b5c-c14066faba60","Type":"ContainerDied","Data":"9042e53619ba159644aa4c995a8ec3c0a1801aeeeeb6b398c07324b5e90d333a"} Dec 03 12:50:55 crc kubenswrapper[4711]: I1203 
12:50:55.197998 4711 scope.go:117] "RemoveContainer" containerID="4a03670217764c13d48577cae5121a1b9a3e885443a6c274f7dd0f99642f5736" Dec 03 12:50:55 crc kubenswrapper[4711]: I1203 12:50:55.198197 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-2" Dec 03 12:50:55 crc kubenswrapper[4711]: I1203 12:50:55.200346 4711 generic.go:334] "Generic (PLEG): container finished" podID="7a68f942-f23a-4131-b62a-70f726cee074" containerID="4e18118da860925154891801320103abd44df5f5a393d765d10ea07b3993a987" exitCode=0 Dec 03 12:50:55 crc kubenswrapper[4711]: I1203 12:50:55.200399 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"7a68f942-f23a-4131-b62a-70f726cee074","Type":"ContainerDied","Data":"4e18118da860925154891801320103abd44df5f5a393d765d10ea07b3993a987"} Dec 03 12:50:55 crc kubenswrapper[4711]: I1203 12:50:55.200431 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Dec 03 12:50:55 crc kubenswrapper[4711]: I1203 12:50:55.200438 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"7a68f942-f23a-4131-b62a-70f726cee074","Type":"ContainerDied","Data":"8c80902e72823f22021bb68a114e4497eac8d1fc8b9f1c69c2d6cc801cf6816e"} Dec 03 12:50:55 crc kubenswrapper[4711]: I1203 12:50:55.226058 4711 scope.go:117] "RemoveContainer" containerID="09a6c83fabe159d844e638112e6b5427af45fb0c064d27276262e1b84800900f" Dec 03 12:50:55 crc kubenswrapper[4711]: I1203 12:50:55.258662 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Dec 03 12:50:55 crc kubenswrapper[4711]: I1203 12:50:55.264979 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Dec 03 12:50:55 crc kubenswrapper[4711]: I1203 12:50:55.267732 4711 scope.go:117] "RemoveContainer" containerID="4a03670217764c13d48577cae5121a1b9a3e885443a6c274f7dd0f99642f5736" Dec 03 12:50:55 crc kubenswrapper[4711]: E1203 12:50:55.268924 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a03670217764c13d48577cae5121a1b9a3e885443a6c274f7dd0f99642f5736\": container with ID starting with 4a03670217764c13d48577cae5121a1b9a3e885443a6c274f7dd0f99642f5736 not found: ID does not exist" containerID="4a03670217764c13d48577cae5121a1b9a3e885443a6c274f7dd0f99642f5736" Dec 03 12:50:55 crc kubenswrapper[4711]: I1203 12:50:55.268985 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a03670217764c13d48577cae5121a1b9a3e885443a6c274f7dd0f99642f5736"} err="failed to get container status \"4a03670217764c13d48577cae5121a1b9a3e885443a6c274f7dd0f99642f5736\": rpc error: code = NotFound desc = could not find container 
\"4a03670217764c13d48577cae5121a1b9a3e885443a6c274f7dd0f99642f5736\": container with ID starting with 4a03670217764c13d48577cae5121a1b9a3e885443a6c274f7dd0f99642f5736 not found: ID does not exist" Dec 03 12:50:55 crc kubenswrapper[4711]: I1203 12:50:55.269013 4711 scope.go:117] "RemoveContainer" containerID="09a6c83fabe159d844e638112e6b5427af45fb0c064d27276262e1b84800900f" Dec 03 12:50:55 crc kubenswrapper[4711]: E1203 12:50:55.269383 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09a6c83fabe159d844e638112e6b5427af45fb0c064d27276262e1b84800900f\": container with ID starting with 09a6c83fabe159d844e638112e6b5427af45fb0c064d27276262e1b84800900f not found: ID does not exist" containerID="09a6c83fabe159d844e638112e6b5427af45fb0c064d27276262e1b84800900f" Dec 03 12:50:55 crc kubenswrapper[4711]: I1203 12:50:55.269410 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09a6c83fabe159d844e638112e6b5427af45fb0c064d27276262e1b84800900f"} err="failed to get container status \"09a6c83fabe159d844e638112e6b5427af45fb0c064d27276262e1b84800900f\": rpc error: code = NotFound desc = could not find container \"09a6c83fabe159d844e638112e6b5427af45fb0c064d27276262e1b84800900f\": container with ID starting with 09a6c83fabe159d844e638112e6b5427af45fb0c064d27276262e1b84800900f not found: ID does not exist" Dec 03 12:50:55 crc kubenswrapper[4711]: I1203 12:50:55.269428 4711 scope.go:117] "RemoveContainer" containerID="4e18118da860925154891801320103abd44df5f5a393d765d10ea07b3993a987" Dec 03 12:50:55 crc kubenswrapper[4711]: I1203 12:50:55.273571 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Dec 03 12:50:55 crc kubenswrapper[4711]: I1203 12:50:55.282196 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Dec 03 12:50:55 crc kubenswrapper[4711]: I1203 
12:50:55.295015 4711 scope.go:117] "RemoveContainer" containerID="9765e1f07a388e4dae1d48ecb1e6a556082806ba40df75a23abb570746c7aaee" Dec 03 12:50:55 crc kubenswrapper[4711]: I1203 12:50:55.315351 4711 scope.go:117] "RemoveContainer" containerID="4e18118da860925154891801320103abd44df5f5a393d765d10ea07b3993a987" Dec 03 12:50:55 crc kubenswrapper[4711]: E1203 12:50:55.315937 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e18118da860925154891801320103abd44df5f5a393d765d10ea07b3993a987\": container with ID starting with 4e18118da860925154891801320103abd44df5f5a393d765d10ea07b3993a987 not found: ID does not exist" containerID="4e18118da860925154891801320103abd44df5f5a393d765d10ea07b3993a987" Dec 03 12:50:55 crc kubenswrapper[4711]: I1203 12:50:55.316003 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e18118da860925154891801320103abd44df5f5a393d765d10ea07b3993a987"} err="failed to get container status \"4e18118da860925154891801320103abd44df5f5a393d765d10ea07b3993a987\": rpc error: code = NotFound desc = could not find container \"4e18118da860925154891801320103abd44df5f5a393d765d10ea07b3993a987\": container with ID starting with 4e18118da860925154891801320103abd44df5f5a393d765d10ea07b3993a987 not found: ID does not exist" Dec 03 12:50:55 crc kubenswrapper[4711]: I1203 12:50:55.316036 4711 scope.go:117] "RemoveContainer" containerID="9765e1f07a388e4dae1d48ecb1e6a556082806ba40df75a23abb570746c7aaee" Dec 03 12:50:55 crc kubenswrapper[4711]: E1203 12:50:55.316539 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9765e1f07a388e4dae1d48ecb1e6a556082806ba40df75a23abb570746c7aaee\": container with ID starting with 9765e1f07a388e4dae1d48ecb1e6a556082806ba40df75a23abb570746c7aaee not found: ID does not exist" 
containerID="9765e1f07a388e4dae1d48ecb1e6a556082806ba40df75a23abb570746c7aaee" Dec 03 12:50:55 crc kubenswrapper[4711]: I1203 12:50:55.316570 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9765e1f07a388e4dae1d48ecb1e6a556082806ba40df75a23abb570746c7aaee"} err="failed to get container status \"9765e1f07a388e4dae1d48ecb1e6a556082806ba40df75a23abb570746c7aaee\": rpc error: code = NotFound desc = could not find container \"9765e1f07a388e4dae1d48ecb1e6a556082806ba40df75a23abb570746c7aaee\": container with ID starting with 9765e1f07a388e4dae1d48ecb1e6a556082806ba40df75a23abb570746c7aaee not found: ID does not exist" Dec 03 12:50:55 crc kubenswrapper[4711]: I1203 12:50:55.829702 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a68f942-f23a-4131-b62a-70f726cee074" path="/var/lib/kubelet/pods/7a68f942-f23a-4131-b62a-70f726cee074/volumes" Dec 03 12:50:55 crc kubenswrapper[4711]: I1203 12:50:55.830845 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6bd59da-4fe4-4f9d-9b5c-c14066faba60" path="/var/lib/kubelet/pods/e6bd59da-4fe4-4f9d-9b5c-c14066faba60/volumes" Dec 03 12:50:55 crc kubenswrapper[4711]: I1203 12:50:55.900211 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 03 12:50:55 crc kubenswrapper[4711]: I1203 12:50:55.900486 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="d62c935e-dea8-493a-a087-7cbf5c2a9542" containerName="glance-log" containerID="cri-o://56c3fc50fd89b132f87c1a7f6058c3d4785f1e2bf13d99de4db26f7bf1d030e3" gracePeriod=30 Dec 03 12:50:55 crc kubenswrapper[4711]: I1203 12:50:55.900562 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="d62c935e-dea8-493a-a087-7cbf5c2a9542" containerName="glance-httpd" 
containerID="cri-o://ce31d16d12772ab6e0cb70ec5d1ceaa19ffa130fadd855b4e7f4cb6154c2d5dd" gracePeriod=30 Dec 03 12:50:56 crc kubenswrapper[4711]: I1203 12:50:56.213094 4711 generic.go:334] "Generic (PLEG): container finished" podID="d62c935e-dea8-493a-a087-7cbf5c2a9542" containerID="56c3fc50fd89b132f87c1a7f6058c3d4785f1e2bf13d99de4db26f7bf1d030e3" exitCode=143 Dec 03 12:50:56 crc kubenswrapper[4711]: I1203 12:50:56.213138 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"d62c935e-dea8-493a-a087-7cbf5c2a9542","Type":"ContainerDied","Data":"56c3fc50fd89b132f87c1a7f6058c3d4785f1e2bf13d99de4db26f7bf1d030e3"} Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.240641 4711 generic.go:334] "Generic (PLEG): container finished" podID="d62c935e-dea8-493a-a087-7cbf5c2a9542" containerID="ce31d16d12772ab6e0cb70ec5d1ceaa19ffa130fadd855b4e7f4cb6154c2d5dd" exitCode=0 Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.240729 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"d62c935e-dea8-493a-a087-7cbf5c2a9542","Type":"ContainerDied","Data":"ce31d16d12772ab6e0cb70ec5d1ceaa19ffa130fadd855b4e7f4cb6154c2d5dd"} Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.438705 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.522259 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-run\") pod \"d62c935e-dea8-493a-a087-7cbf5c2a9542\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.522314 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-etc-iscsi\") pod \"d62c935e-dea8-493a-a087-7cbf5c2a9542\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.522397 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdxvp\" (UniqueName: \"kubernetes.io/projected/d62c935e-dea8-493a-a087-7cbf5c2a9542-kube-api-access-cdxvp\") pod \"d62c935e-dea8-493a-a087-7cbf5c2a9542\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.522419 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d62c935e-dea8-493a-a087-7cbf5c2a9542-logs\") pod \"d62c935e-dea8-493a-a087-7cbf5c2a9542\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.522472 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"d62c935e-dea8-493a-a087-7cbf5c2a9542\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.522502 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d62c935e-dea8-493a-a087-7cbf5c2a9542-scripts\") pod \"d62c935e-dea8-493a-a087-7cbf5c2a9542\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.522533 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-etc-nvme\") pod \"d62c935e-dea8-493a-a087-7cbf5c2a9542\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.522548 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-sys\") pod \"d62c935e-dea8-493a-a087-7cbf5c2a9542\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.522561 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-dev\") pod \"d62c935e-dea8-493a-a087-7cbf5c2a9542\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.522605 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d62c935e-dea8-493a-a087-7cbf5c2a9542-httpd-run\") pod \"d62c935e-dea8-493a-a087-7cbf5c2a9542\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.522617 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"d62c935e-dea8-493a-a087-7cbf5c2a9542\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.522658 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" 
(UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-lib-modules\") pod \"d62c935e-dea8-493a-a087-7cbf5c2a9542\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.522700 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d62c935e-dea8-493a-a087-7cbf5c2a9542-config-data\") pod \"d62c935e-dea8-493a-a087-7cbf5c2a9542\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.522740 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-var-locks-brick\") pod \"d62c935e-dea8-493a-a087-7cbf5c2a9542\" (UID: \"d62c935e-dea8-493a-a087-7cbf5c2a9542\") " Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.523031 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "d62c935e-dea8-493a-a087-7cbf5c2a9542" (UID: "d62c935e-dea8-493a-a087-7cbf5c2a9542"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.523067 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-run" (OuterVolumeSpecName: "run") pod "d62c935e-dea8-493a-a087-7cbf5c2a9542" (UID: "d62c935e-dea8-493a-a087-7cbf5c2a9542"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.523083 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "d62c935e-dea8-493a-a087-7cbf5c2a9542" (UID: "d62c935e-dea8-493a-a087-7cbf5c2a9542"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.523780 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-sys" (OuterVolumeSpecName: "sys") pod "d62c935e-dea8-493a-a087-7cbf5c2a9542" (UID: "d62c935e-dea8-493a-a087-7cbf5c2a9542"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.523891 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-dev" (OuterVolumeSpecName: "dev") pod "d62c935e-dea8-493a-a087-7cbf5c2a9542" (UID: "d62c935e-dea8-493a-a087-7cbf5c2a9542"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.524114 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "d62c935e-dea8-493a-a087-7cbf5c2a9542" (UID: "d62c935e-dea8-493a-a087-7cbf5c2a9542"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.524127 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d62c935e-dea8-493a-a087-7cbf5c2a9542-logs" (OuterVolumeSpecName: "logs") pod "d62c935e-dea8-493a-a087-7cbf5c2a9542" (UID: "d62c935e-dea8-493a-a087-7cbf5c2a9542"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.524389 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d62c935e-dea8-493a-a087-7cbf5c2a9542-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d62c935e-dea8-493a-a087-7cbf5c2a9542" (UID: "d62c935e-dea8-493a-a087-7cbf5c2a9542"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.523991 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "d62c935e-dea8-493a-a087-7cbf5c2a9542" (UID: "d62c935e-dea8-493a-a087-7cbf5c2a9542"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.528713 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d62c935e-dea8-493a-a087-7cbf5c2a9542-scripts" (OuterVolumeSpecName: "scripts") pod "d62c935e-dea8-493a-a087-7cbf5c2a9542" (UID: "d62c935e-dea8-493a-a087-7cbf5c2a9542"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.531058 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage15-crc" (OuterVolumeSpecName: "glance-cache") pod "d62c935e-dea8-493a-a087-7cbf5c2a9542" (UID: "d62c935e-dea8-493a-a087-7cbf5c2a9542"). InnerVolumeSpecName "local-storage15-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.531131 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d62c935e-dea8-493a-a087-7cbf5c2a9542-kube-api-access-cdxvp" (OuterVolumeSpecName: "kube-api-access-cdxvp") pod "d62c935e-dea8-493a-a087-7cbf5c2a9542" (UID: "d62c935e-dea8-493a-a087-7cbf5c2a9542"). InnerVolumeSpecName "kube-api-access-cdxvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.531461 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "d62c935e-dea8-493a-a087-7cbf5c2a9542" (UID: "d62c935e-dea8-493a-a087-7cbf5c2a9542"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.567820 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d62c935e-dea8-493a-a087-7cbf5c2a9542-config-data" (OuterVolumeSpecName: "config-data") pod "d62c935e-dea8-493a-a087-7cbf5c2a9542" (UID: "d62c935e-dea8-493a-a087-7cbf5c2a9542"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.624768 4711 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.624815 4711 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-sys\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.624826 4711 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-dev\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.624836 4711 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d62c935e-dea8-493a-a087-7cbf5c2a9542-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.624873 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.624888 4711 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.624898 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d62c935e-dea8-493a-a087-7cbf5c2a9542-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.624913 4711 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.624938 4711 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.624949 4711 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d62c935e-dea8-493a-a087-7cbf5c2a9542-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.624961 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdxvp\" (UniqueName: \"kubernetes.io/projected/d62c935e-dea8-493a-a087-7cbf5c2a9542-kube-api-access-cdxvp\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.624971 4711 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d62c935e-dea8-493a-a087-7cbf5c2a9542-logs\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.624989 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" " Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.625001 4711 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d62c935e-dea8-493a-a087-7cbf5c2a9542-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.648838 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.654571 4711 operation_generator.go:917] UnmountDevice 
succeeded for volume "local-storage15-crc" (UniqueName: "kubernetes.io/local-volume/local-storage15-crc") on node "crc" Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.726860 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:59 crc kubenswrapper[4711]: I1203 12:50:59.726934 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:00 crc kubenswrapper[4711]: I1203 12:51:00.254664 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"d62c935e-dea8-493a-a087-7cbf5c2a9542","Type":"ContainerDied","Data":"00860316c30ef0e078f2bd6c4260646b36a3480130cdf53a5d75d8a90f5ef778"} Dec 03 12:51:00 crc kubenswrapper[4711]: I1203 12:51:00.254743 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 03 12:51:00 crc kubenswrapper[4711]: I1203 12:51:00.255772 4711 scope.go:117] "RemoveContainer" containerID="ce31d16d12772ab6e0cb70ec5d1ceaa19ffa130fadd855b4e7f4cb6154c2d5dd" Dec 03 12:51:00 crc kubenswrapper[4711]: I1203 12:51:00.283767 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 03 12:51:00 crc kubenswrapper[4711]: I1203 12:51:00.286302 4711 scope.go:117] "RemoveContainer" containerID="56c3fc50fd89b132f87c1a7f6058c3d4785f1e2bf13d99de4db26f7bf1d030e3" Dec 03 12:51:00 crc kubenswrapper[4711]: I1203 12:51:00.290888 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 03 12:51:01 crc kubenswrapper[4711]: I1203 12:51:01.298436 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-g5vp7"] Dec 03 12:51:01 crc kubenswrapper[4711]: I1203 12:51:01.307108 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-g5vp7"] Dec 03 12:51:01 crc kubenswrapper[4711]: I1203 12:51:01.333936 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance8237-account-delete-56hbg"] Dec 03 12:51:01 crc kubenswrapper[4711]: E1203 12:51:01.334427 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a68f942-f23a-4131-b62a-70f726cee074" containerName="glance-log" Dec 03 12:51:01 crc kubenswrapper[4711]: I1203 12:51:01.334449 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a68f942-f23a-4131-b62a-70f726cee074" containerName="glance-log" Dec 03 12:51:01 crc kubenswrapper[4711]: E1203 12:51:01.334467 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d62c935e-dea8-493a-a087-7cbf5c2a9542" containerName="glance-httpd" Dec 03 12:51:01 crc kubenswrapper[4711]: I1203 12:51:01.334498 4711 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d62c935e-dea8-493a-a087-7cbf5c2a9542" containerName="glance-httpd" Dec 03 12:51:01 crc kubenswrapper[4711]: E1203 12:51:01.334512 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6bd59da-4fe4-4f9d-9b5c-c14066faba60" containerName="glance-httpd" Dec 03 12:51:01 crc kubenswrapper[4711]: I1203 12:51:01.334523 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6bd59da-4fe4-4f9d-9b5c-c14066faba60" containerName="glance-httpd" Dec 03 12:51:01 crc kubenswrapper[4711]: E1203 12:51:01.334545 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6bd59da-4fe4-4f9d-9b5c-c14066faba60" containerName="glance-log" Dec 03 12:51:01 crc kubenswrapper[4711]: I1203 12:51:01.334552 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6bd59da-4fe4-4f9d-9b5c-c14066faba60" containerName="glance-log" Dec 03 12:51:01 crc kubenswrapper[4711]: E1203 12:51:01.334598 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d62c935e-dea8-493a-a087-7cbf5c2a9542" containerName="glance-log" Dec 03 12:51:01 crc kubenswrapper[4711]: I1203 12:51:01.334605 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="d62c935e-dea8-493a-a087-7cbf5c2a9542" containerName="glance-log" Dec 03 12:51:01 crc kubenswrapper[4711]: E1203 12:51:01.334614 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a68f942-f23a-4131-b62a-70f726cee074" containerName="glance-httpd" Dec 03 12:51:01 crc kubenswrapper[4711]: I1203 12:51:01.334622 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a68f942-f23a-4131-b62a-70f726cee074" containerName="glance-httpd" Dec 03 12:51:01 crc kubenswrapper[4711]: I1203 12:51:01.334837 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="d62c935e-dea8-493a-a087-7cbf5c2a9542" containerName="glance-httpd" Dec 03 12:51:01 crc kubenswrapper[4711]: I1203 12:51:01.334854 4711 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e6bd59da-4fe4-4f9d-9b5c-c14066faba60" containerName="glance-httpd" Dec 03 12:51:01 crc kubenswrapper[4711]: I1203 12:51:01.334869 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6bd59da-4fe4-4f9d-9b5c-c14066faba60" containerName="glance-log" Dec 03 12:51:01 crc kubenswrapper[4711]: I1203 12:51:01.334920 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a68f942-f23a-4131-b62a-70f726cee074" containerName="glance-log" Dec 03 12:51:01 crc kubenswrapper[4711]: I1203 12:51:01.334930 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="d62c935e-dea8-493a-a087-7cbf5c2a9542" containerName="glance-log" Dec 03 12:51:01 crc kubenswrapper[4711]: I1203 12:51:01.334941 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a68f942-f23a-4131-b62a-70f726cee074" containerName="glance-httpd" Dec 03 12:51:01 crc kubenswrapper[4711]: I1203 12:51:01.335544 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance8237-account-delete-56hbg" Dec 03 12:51:01 crc kubenswrapper[4711]: I1203 12:51:01.390834 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance8237-account-delete-56hbg"] Dec 03 12:51:01 crc kubenswrapper[4711]: I1203 12:51:01.452712 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3227ef64-dd40-4c8d-a0e7-056c9c575165-operator-scripts\") pod \"glance8237-account-delete-56hbg\" (UID: \"3227ef64-dd40-4c8d-a0e7-056c9c575165\") " pod="glance-kuttl-tests/glance8237-account-delete-56hbg" Dec 03 12:51:01 crc kubenswrapper[4711]: I1203 12:51:01.452810 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssx6f\" (UniqueName: \"kubernetes.io/projected/3227ef64-dd40-4c8d-a0e7-056c9c575165-kube-api-access-ssx6f\") pod \"glance8237-account-delete-56hbg\" 
(UID: \"3227ef64-dd40-4c8d-a0e7-056c9c575165\") " pod="glance-kuttl-tests/glance8237-account-delete-56hbg" Dec 03 12:51:01 crc kubenswrapper[4711]: I1203 12:51:01.554115 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3227ef64-dd40-4c8d-a0e7-056c9c575165-operator-scripts\") pod \"glance8237-account-delete-56hbg\" (UID: \"3227ef64-dd40-4c8d-a0e7-056c9c575165\") " pod="glance-kuttl-tests/glance8237-account-delete-56hbg" Dec 03 12:51:01 crc kubenswrapper[4711]: I1203 12:51:01.554502 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssx6f\" (UniqueName: \"kubernetes.io/projected/3227ef64-dd40-4c8d-a0e7-056c9c575165-kube-api-access-ssx6f\") pod \"glance8237-account-delete-56hbg\" (UID: \"3227ef64-dd40-4c8d-a0e7-056c9c575165\") " pod="glance-kuttl-tests/glance8237-account-delete-56hbg" Dec 03 12:51:01 crc kubenswrapper[4711]: I1203 12:51:01.555517 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3227ef64-dd40-4c8d-a0e7-056c9c575165-operator-scripts\") pod \"glance8237-account-delete-56hbg\" (UID: \"3227ef64-dd40-4c8d-a0e7-056c9c575165\") " pod="glance-kuttl-tests/glance8237-account-delete-56hbg" Dec 03 12:51:01 crc kubenswrapper[4711]: I1203 12:51:01.574884 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssx6f\" (UniqueName: \"kubernetes.io/projected/3227ef64-dd40-4c8d-a0e7-056c9c575165-kube-api-access-ssx6f\") pod \"glance8237-account-delete-56hbg\" (UID: \"3227ef64-dd40-4c8d-a0e7-056c9c575165\") " pod="glance-kuttl-tests/glance8237-account-delete-56hbg" Dec 03 12:51:01 crc kubenswrapper[4711]: I1203 12:51:01.651480 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance8237-account-delete-56hbg" Dec 03 12:51:01 crc kubenswrapper[4711]: I1203 12:51:01.829126 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70e886bc-d0fc-4024-a751-3bc1e4233c3d" path="/var/lib/kubelet/pods/70e886bc-d0fc-4024-a751-3bc1e4233c3d/volumes" Dec 03 12:51:01 crc kubenswrapper[4711]: I1203 12:51:01.830767 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d62c935e-dea8-493a-a087-7cbf5c2a9542" path="/var/lib/kubelet/pods/d62c935e-dea8-493a-a087-7cbf5c2a9542/volumes" Dec 03 12:51:02 crc kubenswrapper[4711]: I1203 12:51:02.060864 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance8237-account-delete-56hbg"] Dec 03 12:51:02 crc kubenswrapper[4711]: I1203 12:51:02.297929 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance8237-account-delete-56hbg" event={"ID":"3227ef64-dd40-4c8d-a0e7-056c9c575165","Type":"ContainerStarted","Data":"6df0dc0ea55c01dcfd7b5c6f8501d91d49b25669cba8966664833c8117515079"} Dec 03 12:51:02 crc kubenswrapper[4711]: I1203 12:51:02.297979 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance8237-account-delete-56hbg" event={"ID":"3227ef64-dd40-4c8d-a0e7-056c9c575165","Type":"ContainerStarted","Data":"981275034e0ea28b381bd6d43996a844c36cd4006c366b42d6f4d6b26f735db0"} Dec 03 12:51:02 crc kubenswrapper[4711]: I1203 12:51:02.317712 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance8237-account-delete-56hbg" podStartSLOduration=1.3176938489999999 podStartE2EDuration="1.317693849s" podCreationTimestamp="2025-12-03 12:51:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:51:02.309569687 +0000 UTC m=+2180.978820952" watchObservedRunningTime="2025-12-03 12:51:02.317693849 +0000 UTC 
m=+2180.986945114" Dec 03 12:51:03 crc kubenswrapper[4711]: I1203 12:51:03.305838 4711 generic.go:334] "Generic (PLEG): container finished" podID="3227ef64-dd40-4c8d-a0e7-056c9c575165" containerID="6df0dc0ea55c01dcfd7b5c6f8501d91d49b25669cba8966664833c8117515079" exitCode=0 Dec 03 12:51:03 crc kubenswrapper[4711]: I1203 12:51:03.305897 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance8237-account-delete-56hbg" event={"ID":"3227ef64-dd40-4c8d-a0e7-056c9c575165","Type":"ContainerDied","Data":"6df0dc0ea55c01dcfd7b5c6f8501d91d49b25669cba8966664833c8117515079"} Dec 03 12:51:04 crc kubenswrapper[4711]: I1203 12:51:04.667161 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance8237-account-delete-56hbg" Dec 03 12:51:04 crc kubenswrapper[4711]: I1203 12:51:04.802487 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssx6f\" (UniqueName: \"kubernetes.io/projected/3227ef64-dd40-4c8d-a0e7-056c9c575165-kube-api-access-ssx6f\") pod \"3227ef64-dd40-4c8d-a0e7-056c9c575165\" (UID: \"3227ef64-dd40-4c8d-a0e7-056c9c575165\") " Dec 03 12:51:04 crc kubenswrapper[4711]: I1203 12:51:04.802607 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3227ef64-dd40-4c8d-a0e7-056c9c575165-operator-scripts\") pod \"3227ef64-dd40-4c8d-a0e7-056c9c575165\" (UID: \"3227ef64-dd40-4c8d-a0e7-056c9c575165\") " Dec 03 12:51:04 crc kubenswrapper[4711]: I1203 12:51:04.803698 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3227ef64-dd40-4c8d-a0e7-056c9c575165-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3227ef64-dd40-4c8d-a0e7-056c9c575165" (UID: "3227ef64-dd40-4c8d-a0e7-056c9c575165"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:51:04 crc kubenswrapper[4711]: I1203 12:51:04.807888 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3227ef64-dd40-4c8d-a0e7-056c9c575165-kube-api-access-ssx6f" (OuterVolumeSpecName: "kube-api-access-ssx6f") pod "3227ef64-dd40-4c8d-a0e7-056c9c575165" (UID: "3227ef64-dd40-4c8d-a0e7-056c9c575165"). InnerVolumeSpecName "kube-api-access-ssx6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:51:04 crc kubenswrapper[4711]: I1203 12:51:04.904664 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssx6f\" (UniqueName: \"kubernetes.io/projected/3227ef64-dd40-4c8d-a0e7-056c9c575165-kube-api-access-ssx6f\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:04 crc kubenswrapper[4711]: I1203 12:51:04.904694 4711 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3227ef64-dd40-4c8d-a0e7-056c9c575165-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:05 crc kubenswrapper[4711]: I1203 12:51:05.323048 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance8237-account-delete-56hbg" event={"ID":"3227ef64-dd40-4c8d-a0e7-056c9c575165","Type":"ContainerDied","Data":"981275034e0ea28b381bd6d43996a844c36cd4006c366b42d6f4d6b26f735db0"} Dec 03 12:51:05 crc kubenswrapper[4711]: I1203 12:51:05.323092 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="981275034e0ea28b381bd6d43996a844c36cd4006c366b42d6f4d6b26f735db0" Dec 03 12:51:05 crc kubenswrapper[4711]: I1203 12:51:05.323160 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance8237-account-delete-56hbg" Dec 03 12:51:05 crc kubenswrapper[4711]: I1203 12:51:05.402040 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:51:05 crc kubenswrapper[4711]: I1203 12:51:05.402122 4711 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:51:06 crc kubenswrapper[4711]: I1203 12:51:06.374782 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-lbmrn"] Dec 03 12:51:06 crc kubenswrapper[4711]: I1203 12:51:06.409417 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-lbmrn"] Dec 03 12:51:06 crc kubenswrapper[4711]: I1203 12:51:06.434880 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-8237-account-create-update-48lr9"] Dec 03 12:51:06 crc kubenswrapper[4711]: I1203 12:51:06.442690 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-8237-account-create-update-48lr9"] Dec 03 12:51:06 crc kubenswrapper[4711]: I1203 12:51:06.457186 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance8237-account-delete-56hbg"] Dec 03 12:51:06 crc kubenswrapper[4711]: I1203 12:51:06.467073 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance8237-account-delete-56hbg"] Dec 03 12:51:06 crc kubenswrapper[4711]: I1203 12:51:06.475044 4711 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["glance-kuttl-tests/glance-db-create-k4hnl"] Dec 03 12:51:06 crc kubenswrapper[4711]: E1203 12:51:06.475482 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3227ef64-dd40-4c8d-a0e7-056c9c575165" containerName="mariadb-account-delete" Dec 03 12:51:06 crc kubenswrapper[4711]: I1203 12:51:06.475509 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="3227ef64-dd40-4c8d-a0e7-056c9c575165" containerName="mariadb-account-delete" Dec 03 12:51:06 crc kubenswrapper[4711]: I1203 12:51:06.475709 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="3227ef64-dd40-4c8d-a0e7-056c9c575165" containerName="mariadb-account-delete" Dec 03 12:51:06 crc kubenswrapper[4711]: I1203 12:51:06.476510 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-k4hnl" Dec 03 12:51:06 crc kubenswrapper[4711]: I1203 12:51:06.494062 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-k4hnl"] Dec 03 12:51:06 crc kubenswrapper[4711]: I1203 12:51:06.569109 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-3235-account-create-update-fcx9z"] Dec 03 12:51:06 crc kubenswrapper[4711]: I1203 12:51:06.570012 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-3235-account-create-update-fcx9z" Dec 03 12:51:06 crc kubenswrapper[4711]: I1203 12:51:06.572023 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Dec 03 12:51:06 crc kubenswrapper[4711]: I1203 12:51:06.583017 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-3235-account-create-update-fcx9z"] Dec 03 12:51:06 crc kubenswrapper[4711]: I1203 12:51:06.637765 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxsws\" (UniqueName: \"kubernetes.io/projected/31e8985d-2ba9-413c-a0dd-860084f4fac5-kube-api-access-xxsws\") pod \"glance-db-create-k4hnl\" (UID: \"31e8985d-2ba9-413c-a0dd-860084f4fac5\") " pod="glance-kuttl-tests/glance-db-create-k4hnl" Dec 03 12:51:06 crc kubenswrapper[4711]: I1203 12:51:06.637849 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31e8985d-2ba9-413c-a0dd-860084f4fac5-operator-scripts\") pod \"glance-db-create-k4hnl\" (UID: \"31e8985d-2ba9-413c-a0dd-860084f4fac5\") " pod="glance-kuttl-tests/glance-db-create-k4hnl" Dec 03 12:51:06 crc kubenswrapper[4711]: I1203 12:51:06.739652 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxp8w\" (UniqueName: \"kubernetes.io/projected/c28c016a-8dd3-4a25-9d3e-a25bdcc511bd-kube-api-access-cxp8w\") pod \"glance-3235-account-create-update-fcx9z\" (UID: \"c28c016a-8dd3-4a25-9d3e-a25bdcc511bd\") " pod="glance-kuttl-tests/glance-3235-account-create-update-fcx9z" Dec 03 12:51:06 crc kubenswrapper[4711]: I1203 12:51:06.739817 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxsws\" (UniqueName: \"kubernetes.io/projected/31e8985d-2ba9-413c-a0dd-860084f4fac5-kube-api-access-xxsws\") pod 
\"glance-db-create-k4hnl\" (UID: \"31e8985d-2ba9-413c-a0dd-860084f4fac5\") " pod="glance-kuttl-tests/glance-db-create-k4hnl" Dec 03 12:51:06 crc kubenswrapper[4711]: I1203 12:51:06.739857 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c28c016a-8dd3-4a25-9d3e-a25bdcc511bd-operator-scripts\") pod \"glance-3235-account-create-update-fcx9z\" (UID: \"c28c016a-8dd3-4a25-9d3e-a25bdcc511bd\") " pod="glance-kuttl-tests/glance-3235-account-create-update-fcx9z" Dec 03 12:51:06 crc kubenswrapper[4711]: I1203 12:51:06.739903 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31e8985d-2ba9-413c-a0dd-860084f4fac5-operator-scripts\") pod \"glance-db-create-k4hnl\" (UID: \"31e8985d-2ba9-413c-a0dd-860084f4fac5\") " pod="glance-kuttl-tests/glance-db-create-k4hnl" Dec 03 12:51:06 crc kubenswrapper[4711]: I1203 12:51:06.741120 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31e8985d-2ba9-413c-a0dd-860084f4fac5-operator-scripts\") pod \"glance-db-create-k4hnl\" (UID: \"31e8985d-2ba9-413c-a0dd-860084f4fac5\") " pod="glance-kuttl-tests/glance-db-create-k4hnl" Dec 03 12:51:06 crc kubenswrapper[4711]: I1203 12:51:06.760153 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxsws\" (UniqueName: \"kubernetes.io/projected/31e8985d-2ba9-413c-a0dd-860084f4fac5-kube-api-access-xxsws\") pod \"glance-db-create-k4hnl\" (UID: \"31e8985d-2ba9-413c-a0dd-860084f4fac5\") " pod="glance-kuttl-tests/glance-db-create-k4hnl" Dec 03 12:51:06 crc kubenswrapper[4711]: I1203 12:51:06.799071 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-k4hnl" Dec 03 12:51:06 crc kubenswrapper[4711]: I1203 12:51:06.841679 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c28c016a-8dd3-4a25-9d3e-a25bdcc511bd-operator-scripts\") pod \"glance-3235-account-create-update-fcx9z\" (UID: \"c28c016a-8dd3-4a25-9d3e-a25bdcc511bd\") " pod="glance-kuttl-tests/glance-3235-account-create-update-fcx9z" Dec 03 12:51:06 crc kubenswrapper[4711]: I1203 12:51:06.841821 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxp8w\" (UniqueName: \"kubernetes.io/projected/c28c016a-8dd3-4a25-9d3e-a25bdcc511bd-kube-api-access-cxp8w\") pod \"glance-3235-account-create-update-fcx9z\" (UID: \"c28c016a-8dd3-4a25-9d3e-a25bdcc511bd\") " pod="glance-kuttl-tests/glance-3235-account-create-update-fcx9z" Dec 03 12:51:06 crc kubenswrapper[4711]: I1203 12:51:06.842601 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c28c016a-8dd3-4a25-9d3e-a25bdcc511bd-operator-scripts\") pod \"glance-3235-account-create-update-fcx9z\" (UID: \"c28c016a-8dd3-4a25-9d3e-a25bdcc511bd\") " pod="glance-kuttl-tests/glance-3235-account-create-update-fcx9z" Dec 03 12:51:06 crc kubenswrapper[4711]: I1203 12:51:06.871487 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxp8w\" (UniqueName: \"kubernetes.io/projected/c28c016a-8dd3-4a25-9d3e-a25bdcc511bd-kube-api-access-cxp8w\") pod \"glance-3235-account-create-update-fcx9z\" (UID: \"c28c016a-8dd3-4a25-9d3e-a25bdcc511bd\") " pod="glance-kuttl-tests/glance-3235-account-create-update-fcx9z" Dec 03 12:51:06 crc kubenswrapper[4711]: I1203 12:51:06.891510 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-3235-account-create-update-fcx9z" Dec 03 12:51:07 crc kubenswrapper[4711]: I1203 12:51:07.272823 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-k4hnl"] Dec 03 12:51:07 crc kubenswrapper[4711]: I1203 12:51:07.345683 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-k4hnl" event={"ID":"31e8985d-2ba9-413c-a0dd-860084f4fac5","Type":"ContainerStarted","Data":"df03060fc7ce7419ead9d48c649962d7e06fa3953a79d7a5a09de11233788376"} Dec 03 12:51:07 crc kubenswrapper[4711]: I1203 12:51:07.355910 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-3235-account-create-update-fcx9z"] Dec 03 12:51:07 crc kubenswrapper[4711]: W1203 12:51:07.358316 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc28c016a_8dd3_4a25_9d3e_a25bdcc511bd.slice/crio-be34bcd502f3633852b8c409a9a79e2f9299bf64070e7dc4466e74a023ed139b WatchSource:0}: Error finding container be34bcd502f3633852b8c409a9a79e2f9299bf64070e7dc4466e74a023ed139b: Status 404 returned error can't find the container with id be34bcd502f3633852b8c409a9a79e2f9299bf64070e7dc4466e74a023ed139b Dec 03 12:51:07 crc kubenswrapper[4711]: I1203 12:51:07.827188 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c9a14a1-1d89-4eab-8aee-10d3c98ce90f" path="/var/lib/kubelet/pods/0c9a14a1-1d89-4eab-8aee-10d3c98ce90f/volumes" Dec 03 12:51:07 crc kubenswrapper[4711]: I1203 12:51:07.828036 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c7376b2-9de2-4c0f-8311-8e422b08f2cb" path="/var/lib/kubelet/pods/1c7376b2-9de2-4c0f-8311-8e422b08f2cb/volumes" Dec 03 12:51:07 crc kubenswrapper[4711]: I1203 12:51:07.828478 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3227ef64-dd40-4c8d-a0e7-056c9c575165" 
path="/var/lib/kubelet/pods/3227ef64-dd40-4c8d-a0e7-056c9c575165/volumes" Dec 03 12:51:08 crc kubenswrapper[4711]: I1203 12:51:08.359299 4711 generic.go:334] "Generic (PLEG): container finished" podID="c28c016a-8dd3-4a25-9d3e-a25bdcc511bd" containerID="8642c90613c5ec654cbdb083f5b9203693c9cd0e2a0e928a42441d179c5e2162" exitCode=0 Dec 03 12:51:08 crc kubenswrapper[4711]: I1203 12:51:08.359393 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-3235-account-create-update-fcx9z" event={"ID":"c28c016a-8dd3-4a25-9d3e-a25bdcc511bd","Type":"ContainerDied","Data":"8642c90613c5ec654cbdb083f5b9203693c9cd0e2a0e928a42441d179c5e2162"} Dec 03 12:51:08 crc kubenswrapper[4711]: I1203 12:51:08.359485 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-3235-account-create-update-fcx9z" event={"ID":"c28c016a-8dd3-4a25-9d3e-a25bdcc511bd","Type":"ContainerStarted","Data":"be34bcd502f3633852b8c409a9a79e2f9299bf64070e7dc4466e74a023ed139b"} Dec 03 12:51:08 crc kubenswrapper[4711]: I1203 12:51:08.370824 4711 generic.go:334] "Generic (PLEG): container finished" podID="31e8985d-2ba9-413c-a0dd-860084f4fac5" containerID="6baa336b4bc24571c7bee7a979cd709c72b3854c08c6077dad7bab78b7307fdf" exitCode=0 Dec 03 12:51:08 crc kubenswrapper[4711]: I1203 12:51:08.370881 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-k4hnl" event={"ID":"31e8985d-2ba9-413c-a0dd-860084f4fac5","Type":"ContainerDied","Data":"6baa336b4bc24571c7bee7a979cd709c72b3854c08c6077dad7bab78b7307fdf"} Dec 03 12:51:09 crc kubenswrapper[4711]: I1203 12:51:09.733225 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-k4hnl" Dec 03 12:51:09 crc kubenswrapper[4711]: I1203 12:51:09.737948 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-3235-account-create-update-fcx9z" Dec 03 12:51:09 crc kubenswrapper[4711]: I1203 12:51:09.889231 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c28c016a-8dd3-4a25-9d3e-a25bdcc511bd-operator-scripts\") pod \"c28c016a-8dd3-4a25-9d3e-a25bdcc511bd\" (UID: \"c28c016a-8dd3-4a25-9d3e-a25bdcc511bd\") " Dec 03 12:51:09 crc kubenswrapper[4711]: I1203 12:51:09.889270 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxsws\" (UniqueName: \"kubernetes.io/projected/31e8985d-2ba9-413c-a0dd-860084f4fac5-kube-api-access-xxsws\") pod \"31e8985d-2ba9-413c-a0dd-860084f4fac5\" (UID: \"31e8985d-2ba9-413c-a0dd-860084f4fac5\") " Dec 03 12:51:09 crc kubenswrapper[4711]: I1203 12:51:09.889329 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31e8985d-2ba9-413c-a0dd-860084f4fac5-operator-scripts\") pod \"31e8985d-2ba9-413c-a0dd-860084f4fac5\" (UID: \"31e8985d-2ba9-413c-a0dd-860084f4fac5\") " Dec 03 12:51:09 crc kubenswrapper[4711]: I1203 12:51:09.889422 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxp8w\" (UniqueName: \"kubernetes.io/projected/c28c016a-8dd3-4a25-9d3e-a25bdcc511bd-kube-api-access-cxp8w\") pod \"c28c016a-8dd3-4a25-9d3e-a25bdcc511bd\" (UID: \"c28c016a-8dd3-4a25-9d3e-a25bdcc511bd\") " Dec 03 12:51:09 crc kubenswrapper[4711]: I1203 12:51:09.890265 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c28c016a-8dd3-4a25-9d3e-a25bdcc511bd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c28c016a-8dd3-4a25-9d3e-a25bdcc511bd" (UID: "c28c016a-8dd3-4a25-9d3e-a25bdcc511bd"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:51:09 crc kubenswrapper[4711]: I1203 12:51:09.890405 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31e8985d-2ba9-413c-a0dd-860084f4fac5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "31e8985d-2ba9-413c-a0dd-860084f4fac5" (UID: "31e8985d-2ba9-413c-a0dd-860084f4fac5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:51:09 crc kubenswrapper[4711]: I1203 12:51:09.894574 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c28c016a-8dd3-4a25-9d3e-a25bdcc511bd-kube-api-access-cxp8w" (OuterVolumeSpecName: "kube-api-access-cxp8w") pod "c28c016a-8dd3-4a25-9d3e-a25bdcc511bd" (UID: "c28c016a-8dd3-4a25-9d3e-a25bdcc511bd"). InnerVolumeSpecName "kube-api-access-cxp8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:51:09 crc kubenswrapper[4711]: I1203 12:51:09.895747 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31e8985d-2ba9-413c-a0dd-860084f4fac5-kube-api-access-xxsws" (OuterVolumeSpecName: "kube-api-access-xxsws") pod "31e8985d-2ba9-413c-a0dd-860084f4fac5" (UID: "31e8985d-2ba9-413c-a0dd-860084f4fac5"). InnerVolumeSpecName "kube-api-access-xxsws". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:51:09 crc kubenswrapper[4711]: I1203 12:51:09.991345 4711 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c28c016a-8dd3-4a25-9d3e-a25bdcc511bd-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:09 crc kubenswrapper[4711]: I1203 12:51:09.991382 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxsws\" (UniqueName: \"kubernetes.io/projected/31e8985d-2ba9-413c-a0dd-860084f4fac5-kube-api-access-xxsws\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:09 crc kubenswrapper[4711]: I1203 12:51:09.991394 4711 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31e8985d-2ba9-413c-a0dd-860084f4fac5-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:09 crc kubenswrapper[4711]: I1203 12:51:09.991404 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxp8w\" (UniqueName: \"kubernetes.io/projected/c28c016a-8dd3-4a25-9d3e-a25bdcc511bd-kube-api-access-cxp8w\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:10 crc kubenswrapper[4711]: I1203 12:51:10.393430 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-3235-account-create-update-fcx9z" event={"ID":"c28c016a-8dd3-4a25-9d3e-a25bdcc511bd","Type":"ContainerDied","Data":"be34bcd502f3633852b8c409a9a79e2f9299bf64070e7dc4466e74a023ed139b"} Dec 03 12:51:10 crc kubenswrapper[4711]: I1203 12:51:10.393486 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be34bcd502f3633852b8c409a9a79e2f9299bf64070e7dc4466e74a023ed139b" Dec 03 12:51:10 crc kubenswrapper[4711]: I1203 12:51:10.393495 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-3235-account-create-update-fcx9z" Dec 03 12:51:10 crc kubenswrapper[4711]: I1203 12:51:10.395697 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-k4hnl" event={"ID":"31e8985d-2ba9-413c-a0dd-860084f4fac5","Type":"ContainerDied","Data":"df03060fc7ce7419ead9d48c649962d7e06fa3953a79d7a5a09de11233788376"} Dec 03 12:51:10 crc kubenswrapper[4711]: I1203 12:51:10.395746 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df03060fc7ce7419ead9d48c649962d7e06fa3953a79d7a5a09de11233788376" Dec 03 12:51:10 crc kubenswrapper[4711]: I1203 12:51:10.395789 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-k4hnl" Dec 03 12:51:11 crc kubenswrapper[4711]: I1203 12:51:11.713815 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-dn7gg"] Dec 03 12:51:11 crc kubenswrapper[4711]: E1203 12:51:11.714661 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31e8985d-2ba9-413c-a0dd-860084f4fac5" containerName="mariadb-database-create" Dec 03 12:51:11 crc kubenswrapper[4711]: I1203 12:51:11.714685 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="31e8985d-2ba9-413c-a0dd-860084f4fac5" containerName="mariadb-database-create" Dec 03 12:51:11 crc kubenswrapper[4711]: E1203 12:51:11.714785 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c28c016a-8dd3-4a25-9d3e-a25bdcc511bd" containerName="mariadb-account-create-update" Dec 03 12:51:11 crc kubenswrapper[4711]: I1203 12:51:11.714802 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="c28c016a-8dd3-4a25-9d3e-a25bdcc511bd" containerName="mariadb-account-create-update" Dec 03 12:51:11 crc kubenswrapper[4711]: I1203 12:51:11.715144 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="31e8985d-2ba9-413c-a0dd-860084f4fac5" 
containerName="mariadb-database-create" Dec 03 12:51:11 crc kubenswrapper[4711]: I1203 12:51:11.715189 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="c28c016a-8dd3-4a25-9d3e-a25bdcc511bd" containerName="mariadb-account-create-update" Dec 03 12:51:11 crc kubenswrapper[4711]: I1203 12:51:11.716010 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-dn7gg" Dec 03 12:51:11 crc kubenswrapper[4711]: I1203 12:51:11.718345 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-9sptl" Dec 03 12:51:11 crc kubenswrapper[4711]: I1203 12:51:11.718620 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Dec 03 12:51:11 crc kubenswrapper[4711]: I1203 12:51:11.720140 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-dn7gg"] Dec 03 12:51:11 crc kubenswrapper[4711]: I1203 12:51:11.820359 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8a532310-9c14-45b6-8d03-ae94aa7ecd77-db-sync-config-data\") pod \"glance-db-sync-dn7gg\" (UID: \"8a532310-9c14-45b6-8d03-ae94aa7ecd77\") " pod="glance-kuttl-tests/glance-db-sync-dn7gg" Dec 03 12:51:11 crc kubenswrapper[4711]: I1203 12:51:11.820428 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq9g2\" (UniqueName: \"kubernetes.io/projected/8a532310-9c14-45b6-8d03-ae94aa7ecd77-kube-api-access-sq9g2\") pod \"glance-db-sync-dn7gg\" (UID: \"8a532310-9c14-45b6-8d03-ae94aa7ecd77\") " pod="glance-kuttl-tests/glance-db-sync-dn7gg" Dec 03 12:51:11 crc kubenswrapper[4711]: I1203 12:51:11.820446 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8a532310-9c14-45b6-8d03-ae94aa7ecd77-config-data\") pod \"glance-db-sync-dn7gg\" (UID: \"8a532310-9c14-45b6-8d03-ae94aa7ecd77\") " pod="glance-kuttl-tests/glance-db-sync-dn7gg" Dec 03 12:51:11 crc kubenswrapper[4711]: I1203 12:51:11.922415 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq9g2\" (UniqueName: \"kubernetes.io/projected/8a532310-9c14-45b6-8d03-ae94aa7ecd77-kube-api-access-sq9g2\") pod \"glance-db-sync-dn7gg\" (UID: \"8a532310-9c14-45b6-8d03-ae94aa7ecd77\") " pod="glance-kuttl-tests/glance-db-sync-dn7gg" Dec 03 12:51:11 crc kubenswrapper[4711]: I1203 12:51:11.922465 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a532310-9c14-45b6-8d03-ae94aa7ecd77-config-data\") pod \"glance-db-sync-dn7gg\" (UID: \"8a532310-9c14-45b6-8d03-ae94aa7ecd77\") " pod="glance-kuttl-tests/glance-db-sync-dn7gg" Dec 03 12:51:11 crc kubenswrapper[4711]: I1203 12:51:11.922642 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8a532310-9c14-45b6-8d03-ae94aa7ecd77-db-sync-config-data\") pod \"glance-db-sync-dn7gg\" (UID: \"8a532310-9c14-45b6-8d03-ae94aa7ecd77\") " pod="glance-kuttl-tests/glance-db-sync-dn7gg" Dec 03 12:51:11 crc kubenswrapper[4711]: I1203 12:51:11.927341 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8a532310-9c14-45b6-8d03-ae94aa7ecd77-db-sync-config-data\") pod \"glance-db-sync-dn7gg\" (UID: \"8a532310-9c14-45b6-8d03-ae94aa7ecd77\") " pod="glance-kuttl-tests/glance-db-sync-dn7gg" Dec 03 12:51:11 crc kubenswrapper[4711]: I1203 12:51:11.928108 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a532310-9c14-45b6-8d03-ae94aa7ecd77-config-data\") pod 
\"glance-db-sync-dn7gg\" (UID: \"8a532310-9c14-45b6-8d03-ae94aa7ecd77\") " pod="glance-kuttl-tests/glance-db-sync-dn7gg" Dec 03 12:51:11 crc kubenswrapper[4711]: I1203 12:51:11.957873 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq9g2\" (UniqueName: \"kubernetes.io/projected/8a532310-9c14-45b6-8d03-ae94aa7ecd77-kube-api-access-sq9g2\") pod \"glance-db-sync-dn7gg\" (UID: \"8a532310-9c14-45b6-8d03-ae94aa7ecd77\") " pod="glance-kuttl-tests/glance-db-sync-dn7gg" Dec 03 12:51:12 crc kubenswrapper[4711]: I1203 12:51:12.048984 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-dn7gg" Dec 03 12:51:12 crc kubenswrapper[4711]: I1203 12:51:12.553019 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-dn7gg"] Dec 03 12:51:13 crc kubenswrapper[4711]: I1203 12:51:13.430955 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-dn7gg" event={"ID":"8a532310-9c14-45b6-8d03-ae94aa7ecd77","Type":"ContainerStarted","Data":"2f0b6add244cc20c53f8a0ac35ea7317a2cd7e7f292ea8a90caa41ba4b72d187"} Dec 03 12:51:13 crc kubenswrapper[4711]: I1203 12:51:13.431315 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-dn7gg" event={"ID":"8a532310-9c14-45b6-8d03-ae94aa7ecd77","Type":"ContainerStarted","Data":"696572fdd7c919b9f859f94d8b0f52798957c12a66d11d0b81a3f444011f7881"} Dec 03 12:51:13 crc kubenswrapper[4711]: I1203 12:51:13.446499 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-dn7gg" podStartSLOduration=2.446478745 podStartE2EDuration="2.446478745s" podCreationTimestamp="2025-12-03 12:51:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:51:13.446427494 +0000 UTC m=+2192.115678769" 
watchObservedRunningTime="2025-12-03 12:51:13.446478745 +0000 UTC m=+2192.115730000" Dec 03 12:51:16 crc kubenswrapper[4711]: I1203 12:51:16.465463 4711 generic.go:334] "Generic (PLEG): container finished" podID="8a532310-9c14-45b6-8d03-ae94aa7ecd77" containerID="2f0b6add244cc20c53f8a0ac35ea7317a2cd7e7f292ea8a90caa41ba4b72d187" exitCode=0 Dec 03 12:51:16 crc kubenswrapper[4711]: I1203 12:51:16.465639 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-dn7gg" event={"ID":"8a532310-9c14-45b6-8d03-ae94aa7ecd77","Type":"ContainerDied","Data":"2f0b6add244cc20c53f8a0ac35ea7317a2cd7e7f292ea8a90caa41ba4b72d187"} Dec 03 12:51:17 crc kubenswrapper[4711]: I1203 12:51:17.848178 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-dn7gg" Dec 03 12:51:18 crc kubenswrapper[4711]: I1203 12:51:18.025812 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sq9g2\" (UniqueName: \"kubernetes.io/projected/8a532310-9c14-45b6-8d03-ae94aa7ecd77-kube-api-access-sq9g2\") pod \"8a532310-9c14-45b6-8d03-ae94aa7ecd77\" (UID: \"8a532310-9c14-45b6-8d03-ae94aa7ecd77\") " Dec 03 12:51:18 crc kubenswrapper[4711]: I1203 12:51:18.026197 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a532310-9c14-45b6-8d03-ae94aa7ecd77-config-data\") pod \"8a532310-9c14-45b6-8d03-ae94aa7ecd77\" (UID: \"8a532310-9c14-45b6-8d03-ae94aa7ecd77\") " Dec 03 12:51:18 crc kubenswrapper[4711]: I1203 12:51:18.026279 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8a532310-9c14-45b6-8d03-ae94aa7ecd77-db-sync-config-data\") pod \"8a532310-9c14-45b6-8d03-ae94aa7ecd77\" (UID: \"8a532310-9c14-45b6-8d03-ae94aa7ecd77\") " Dec 03 12:51:18 crc kubenswrapper[4711]: I1203 12:51:18.030885 4711 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a532310-9c14-45b6-8d03-ae94aa7ecd77-kube-api-access-sq9g2" (OuterVolumeSpecName: "kube-api-access-sq9g2") pod "8a532310-9c14-45b6-8d03-ae94aa7ecd77" (UID: "8a532310-9c14-45b6-8d03-ae94aa7ecd77"). InnerVolumeSpecName "kube-api-access-sq9g2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:51:18 crc kubenswrapper[4711]: I1203 12:51:18.032351 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a532310-9c14-45b6-8d03-ae94aa7ecd77-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8a532310-9c14-45b6-8d03-ae94aa7ecd77" (UID: "8a532310-9c14-45b6-8d03-ae94aa7ecd77"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:51:18 crc kubenswrapper[4711]: I1203 12:51:18.087240 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a532310-9c14-45b6-8d03-ae94aa7ecd77-config-data" (OuterVolumeSpecName: "config-data") pod "8a532310-9c14-45b6-8d03-ae94aa7ecd77" (UID: "8a532310-9c14-45b6-8d03-ae94aa7ecd77"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:51:18 crc kubenswrapper[4711]: I1203 12:51:18.128075 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a532310-9c14-45b6-8d03-ae94aa7ecd77-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:18 crc kubenswrapper[4711]: I1203 12:51:18.128106 4711 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8a532310-9c14-45b6-8d03-ae94aa7ecd77-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:18 crc kubenswrapper[4711]: I1203 12:51:18.128117 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sq9g2\" (UniqueName: \"kubernetes.io/projected/8a532310-9c14-45b6-8d03-ae94aa7ecd77-kube-api-access-sq9g2\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:18 crc kubenswrapper[4711]: I1203 12:51:18.491373 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-dn7gg" event={"ID":"8a532310-9c14-45b6-8d03-ae94aa7ecd77","Type":"ContainerDied","Data":"696572fdd7c919b9f859f94d8b0f52798957c12a66d11d0b81a3f444011f7881"} Dec 03 12:51:18 crc kubenswrapper[4711]: I1203 12:51:18.491427 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="696572fdd7c919b9f859f94d8b0f52798957c12a66d11d0b81a3f444011f7881" Dec 03 12:51:18 crc kubenswrapper[4711]: I1203 12:51:18.491483 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-dn7gg" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.666354 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 03 12:51:19 crc kubenswrapper[4711]: E1203 12:51:19.666965 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a532310-9c14-45b6-8d03-ae94aa7ecd77" containerName="glance-db-sync" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.666980 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a532310-9c14-45b6-8d03-ae94aa7ecd77" containerName="glance-db-sync" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.667118 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a532310-9c14-45b6-8d03-ae94aa7ecd77" containerName="glance-db-sync" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.667992 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.670611 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.670878 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-9sptl" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.680970 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.683829 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-external-config-data" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.753062 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") 
pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.753126 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt2dz\" (UniqueName: \"kubernetes.io/projected/75fbd2cd-656d-4547-bd68-2b679f418780-kube-api-access-zt2dz\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.753159 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.753190 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.753224 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/75fbd2cd-656d-4547-bd68-2b679f418780-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.753256 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/75fbd2cd-656d-4547-bd68-2b679f418780-logs\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.753276 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-dev\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.753304 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.753331 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.753352 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-run\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.753464 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/75fbd2cd-656d-4547-bd68-2b679f418780-config-data\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.753502 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.753541 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75fbd2cd-656d-4547-bd68-2b679f418780-scripts\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.753593 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-sys\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.854884 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.854970 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zt2dz\" (UniqueName: \"kubernetes.io/projected/75fbd2cd-656d-4547-bd68-2b679f418780-kube-api-access-zt2dz\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.855003 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.855029 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.855063 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/75fbd2cd-656d-4547-bd68-2b679f418780-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.855090 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75fbd2cd-656d-4547-bd68-2b679f418780-logs\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.855109 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-dev\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.855132 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.855157 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.855176 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-run\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.855213 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75fbd2cd-656d-4547-bd68-2b679f418780-config-data\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.855236 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.855259 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75fbd2cd-656d-4547-bd68-2b679f418780-scripts\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.855283 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-sys\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.855384 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-sys\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.855751 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") device mount path \"/mnt/openstack/pv12\"" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.856063 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.856143 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.856744 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/75fbd2cd-656d-4547-bd68-2b679f418780-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.856825 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-run\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.856862 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-dev\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.856927 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-var-locks-brick\") pod 
\"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.857017 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.857106 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.857219 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75fbd2cd-656d-4547-bd68-2b679f418780-logs\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.862700 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75fbd2cd-656d-4547-bd68-2b679f418780-config-data\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.866602 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75fbd2cd-656d-4547-bd68-2b679f418780-scripts\") pod \"glance-default-external-api-1\" (UID: 
\"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.873742 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt2dz\" (UniqueName: \"kubernetes.io/projected/75fbd2cd-656d-4547-bd68-2b679f418780-kube-api-access-zt2dz\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.884739 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.886633 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-1\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.890786 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.892363 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.914091 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.950406 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.952266 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.955705 4711 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.956645 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.957502 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.957563 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/831817e3-651d-47a1-ae54-ddebe2dfc17b-config-data\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.957599 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/831817e3-651d-47a1-ae54-ddebe2dfc17b-scripts\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.957628 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.957662 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-sys\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.957678 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.957693 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcr9w\" (UniqueName: \"kubernetes.io/projected/831817e3-651d-47a1-ae54-ddebe2dfc17b-kube-api-access-fcr9w\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.957711 4711 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-dev\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.957727 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.957746 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/831817e3-651d-47a1-ae54-ddebe2dfc17b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.957776 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/831817e3-651d-47a1-ae54-ddebe2dfc17b-logs\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.957793 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.957809 4711 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-run\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.957823 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.967872 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.968994 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.991945 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:19 crc kubenswrapper[4711]: I1203 12:51:19.997966 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.060368 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/831817e3-651d-47a1-ae54-ddebe2dfc17b-scripts\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.060435 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.060477 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-sys\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.060501 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.060525 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcr9w\" (UniqueName: 
\"kubernetes.io/projected/831817e3-651d-47a1-ae54-ddebe2dfc17b-kube-api-access-fcr9w\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.060554 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-dev\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.060576 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.060601 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/831817e3-651d-47a1-ae54-ddebe2dfc17b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.060636 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/831817e3-651d-47a1-ae54-ddebe2dfc17b-logs\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.060657 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.060678 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-run\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.060697 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.060732 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.060765 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/831817e3-651d-47a1-ae54-ddebe2dfc17b-config-data\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.063753 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") 
pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") device mount path \"/mnt/openstack/pv07\"" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.068317 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.068393 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.068459 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-run\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.068492 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-sys\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.068747 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: 
\"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.068861 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") device mount path \"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.069058 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/831817e3-651d-47a1-ae54-ddebe2dfc17b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.069104 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/831817e3-651d-47a1-ae54-ddebe2dfc17b-logs\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.069128 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-dev\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.069154 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.071187 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/831817e3-651d-47a1-ae54-ddebe2dfc17b-config-data\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.089528 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/831817e3-651d-47a1-ae54-ddebe2dfc17b-scripts\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.089677 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.093060 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcr9w\" (UniqueName: \"kubernetes.io/projected/831817e3-651d-47a1-ae54-ddebe2dfc17b-kube-api-access-fcr9w\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.102198 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: 
I1203 12:51:20.162442 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-dev\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.162489 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-run\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.162511 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.162528 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.162549 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 
12:51:20.162566 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.162586 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.162601 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4204740-8b38-44c3-b8cf-c8eb781c044a-logs\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.162622 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4204740-8b38-44c3-b8cf-c8eb781c044a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.162639 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc 
kubenswrapper[4711]: I1203 12:51:20.162654 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.162671 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.162690 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.162706 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-run\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.162744 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e0c59cd-859f-45b6-be89-da265a062ceb-config-data\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " 
pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.162760 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-sys\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.162775 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4204740-8b38-44c3-b8cf-c8eb781c044a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.162791 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.162808 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e0c59cd-859f-45b6-be89-da265a062ceb-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.162840 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-var-locks-brick\") pod \"glance-default-internal-api-0\" 
(UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.162860 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e0c59cd-859f-45b6-be89-da265a062ceb-scripts\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.162881 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-dev\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.162896 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4204740-8b38-44c3-b8cf-c8eb781c044a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.162928 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkrjb\" (UniqueName: \"kubernetes.io/projected/e4204740-8b38-44c3-b8cf-c8eb781c044a-kube-api-access-lkrjb\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.162943 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pcm4\" (UniqueName: 
\"kubernetes.io/projected/5e0c59cd-859f-45b6-be89-da265a062ceb-kube-api-access-7pcm4\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.162957 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-sys\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.162970 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e0c59cd-859f-45b6-be89-da265a062ceb-logs\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.162989 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.256024 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.263896 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e0c59cd-859f-45b6-be89-da265a062ceb-scripts\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.263970 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-dev\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.263998 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4204740-8b38-44c3-b8cf-c8eb781c044a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264028 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkrjb\" (UniqueName: \"kubernetes.io/projected/e4204740-8b38-44c3-b8cf-c8eb781c044a-kube-api-access-lkrjb\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264049 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pcm4\" (UniqueName: \"kubernetes.io/projected/5e0c59cd-859f-45b6-be89-da265a062ceb-kube-api-access-7pcm4\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " 
pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264071 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-sys\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264091 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e0c59cd-859f-45b6-be89-da265a062ceb-logs\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264043 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-dev\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264119 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264167 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-dev\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264192 
4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-run\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264216 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264234 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264259 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264279 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264302 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264322 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4204740-8b38-44c3-b8cf-c8eb781c044a-logs\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264328 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-run\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264350 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4204740-8b38-44c3-b8cf-c8eb781c044a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264374 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264394 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264417 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264445 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264456 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264467 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-run\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264506 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-run\") 
pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264584 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e0c59cd-859f-45b6-be89-da265a062ceb-config-data\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264617 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-sys\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264637 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4204740-8b38-44c3-b8cf-c8eb781c044a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264669 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264694 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e0c59cd-859f-45b6-be89-da265a062ceb-httpd-run\") pod \"glance-default-internal-api-1\" (UID: 
\"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264759 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264771 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e0c59cd-859f-45b6-be89-da265a062ceb-logs\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264952 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") device mount path \"/mnt/openstack/pv17\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264998 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4204740-8b38-44c3-b8cf-c8eb781c044a-logs\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.265009 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") device mount path 
\"/mnt/openstack/pv14\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.264376 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-sys\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.265065 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.265104 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.265145 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.265235 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") device mount path \"/mnt/openstack/pv11\"" 
pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.265416 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.265417 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.265463 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.265464 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") device mount path \"/mnt/openstack/pv02\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.265498 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-dev\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 
12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.265501 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.265514 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-sys\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.265558 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e0c59cd-859f-45b6-be89-da265a062ceb-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.265537 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4204740-8b38-44c3-b8cf-c8eb781c044a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.267925 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4204740-8b38-44c3-b8cf-c8eb781c044a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.270227 4711 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4204740-8b38-44c3-b8cf-c8eb781c044a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.271279 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e0c59cd-859f-45b6-be89-da265a062ceb-config-data\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.273828 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e0c59cd-859f-45b6-be89-da265a062ceb-scripts\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.280888 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkrjb\" (UniqueName: \"kubernetes.io/projected/e4204740-8b38-44c3-b8cf-c8eb781c044a-kube-api-access-lkrjb\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.284774 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pcm4\" (UniqueName: \"kubernetes.io/projected/5e0c59cd-859f-45b6-be89-da265a062ceb-kube-api-access-7pcm4\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.291848 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.292846 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.298791 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.301490 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-internal-api-1\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.436807 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.502620 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.506888 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" 
event={"ID":"75fbd2cd-656d-4547-bd68-2b679f418780","Type":"ContainerStarted","Data":"293a39ddd131945d7813aa8f8bbd4156e5e053db87f1e93de0a3328868ab1b0d"} Dec 03 12:51:20 crc kubenswrapper[4711]: W1203 12:51:20.509297 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod831817e3_651d_47a1_ae54_ddebe2dfc17b.slice/crio-dc7394f3e5a1010e32bfdbedec4c461d689de633896a7a31d561274b7a5f2031 WatchSource:0}: Error finding container dc7394f3e5a1010e32bfdbedec4c461d689de633896a7a31d561274b7a5f2031: Status 404 returned error can't find the container with id dc7394f3e5a1010e32bfdbedec4c461d689de633896a7a31d561274b7a5f2031 Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.528472 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.529568 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.570975 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:20 crc kubenswrapper[4711]: I1203 12:51:20.995445 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.115579 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.517218 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"5e0c59cd-859f-45b6-be89-da265a062ceb","Type":"ContainerStarted","Data":"e9cf7727ab7143ab0975c7e9cbc847518e7b19fcd9ee5915ce9cc5e85c42db2c"} Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.517568 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"5e0c59cd-859f-45b6-be89-da265a062ceb","Type":"ContainerStarted","Data":"be9b8b1d7bc90bd4e874ec2ab971a302c112ac38ff334daaadf083c77478a040"} Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.517585 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"5e0c59cd-859f-45b6-be89-da265a062ceb","Type":"ContainerStarted","Data":"3f38ab0c8305ef95d025b58ff867e0d08ca6ddedeaa34d0b80fcd7255b82e34b"} Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.517718 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="5e0c59cd-859f-45b6-be89-da265a062ceb" containerName="glance-log" containerID="cri-o://be9b8b1d7bc90bd4e874ec2ab971a302c112ac38ff334daaadf083c77478a040" gracePeriod=30 Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.518342 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" 
podUID="5e0c59cd-859f-45b6-be89-da265a062ceb" containerName="glance-httpd" containerID="cri-o://e9cf7727ab7143ab0975c7e9cbc847518e7b19fcd9ee5915ce9cc5e85c42db2c" gracePeriod=30 Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.523624 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e4204740-8b38-44c3-b8cf-c8eb781c044a","Type":"ContainerStarted","Data":"1934c7377f2258bde7cb124b90a3e4bfb2c03b934e7d8a771a530a9042d7f7fd"} Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.523719 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e4204740-8b38-44c3-b8cf-c8eb781c044a","Type":"ContainerStarted","Data":"703e0b660b8b733015591d0752f140232decc1202c8eff0be954ea72ab1acd0a"} Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.523752 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e4204740-8b38-44c3-b8cf-c8eb781c044a","Type":"ContainerStarted","Data":"7ab2c02362f954492a0f67267f24509e3ca2f625499183df20d186b3e9a49ba4"} Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.529203 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"75fbd2cd-656d-4547-bd68-2b679f418780","Type":"ContainerStarted","Data":"bdcfcd5dde4ffac8e4b1493dbbbfd60a6e12d59e046542812db8f50d8cf65d06"} Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.529239 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"75fbd2cd-656d-4547-bd68-2b679f418780","Type":"ContainerStarted","Data":"355f127ffd8448aceba29c3b05d09b59993e407776d176dd53889aba772c6a23"} Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.533783 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" 
event={"ID":"831817e3-651d-47a1-ae54-ddebe2dfc17b","Type":"ContainerStarted","Data":"d7e0067c41e3541b0cf6db6a58c6265429bcf519b06baef12ed838f0d1a76e0c"} Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.533810 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"831817e3-651d-47a1-ae54-ddebe2dfc17b","Type":"ContainerStarted","Data":"82b95992e011935e47f24709e1b2d8c7f9f82404005dd0917e07203a91416dca"} Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.533822 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"831817e3-651d-47a1-ae54-ddebe2dfc17b","Type":"ContainerStarted","Data":"dc7394f3e5a1010e32bfdbedec4c461d689de633896a7a31d561274b7a5f2031"} Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.548124 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-1" podStartSLOduration=3.548079205 podStartE2EDuration="3.548079205s" podCreationTimestamp="2025-12-03 12:51:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:51:21.542503463 +0000 UTC m=+2200.211754758" watchObservedRunningTime="2025-12-03 12:51:21.548079205 +0000 UTC m=+2200.217330480" Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.570852 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-1" podStartSLOduration=2.570835697 podStartE2EDuration="2.570835697s" podCreationTimestamp="2025-12-03 12:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:51:21.567967409 +0000 UTC m=+2200.237218674" watchObservedRunningTime="2025-12-03 12:51:21.570835697 +0000 UTC m=+2200.240086952" Dec 03 12:51:21 crc 
kubenswrapper[4711]: I1203 12:51:21.601250 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.601221968 podStartE2EDuration="3.601221968s" podCreationTimestamp="2025-12-03 12:51:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:51:21.596237792 +0000 UTC m=+2200.265489067" watchObservedRunningTime="2025-12-03 12:51:21.601221968 +0000 UTC m=+2200.270473233" Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.614699 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.614635004 podStartE2EDuration="3.614635004s" podCreationTimestamp="2025-12-03 12:51:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:51:21.612915228 +0000 UTC m=+2200.282166483" watchObservedRunningTime="2025-12-03 12:51:21.614635004 +0000 UTC m=+2200.283886259" Dec 03 12:51:21 crc kubenswrapper[4711]: E1203 12:51:21.694175 4711 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e0c59cd_859f_45b6_be89_da265a062ceb.slice/crio-conmon-be9b8b1d7bc90bd4e874ec2ab971a302c112ac38ff334daaadf083c77478a040.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e0c59cd_859f_45b6_be89_da265a062ceb.slice/crio-conmon-e9cf7727ab7143ab0975c7e9cbc847518e7b19fcd9ee5915ce9cc5e85c42db2c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e0c59cd_859f_45b6_be89_da265a062ceb.slice/crio-e9cf7727ab7143ab0975c7e9cbc847518e7b19fcd9ee5915ce9cc5e85c42db2c.scope\": 
RecentStats: unable to find data in memory cache]" Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.863559 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.909826 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e0c59cd-859f-45b6-be89-da265a062ceb-config-data\") pod \"5e0c59cd-859f-45b6-be89-da265a062ceb\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.909932 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-run\") pod \"5e0c59cd-859f-45b6-be89-da265a062ceb\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.910080 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-dev\") pod \"5e0c59cd-859f-45b6-be89-da265a062ceb\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.910123 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e0c59cd-859f-45b6-be89-da265a062ceb-httpd-run\") pod \"5e0c59cd-859f-45b6-be89-da265a062ceb\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.910181 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"5e0c59cd-859f-45b6-be89-da265a062ceb\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.910245 4711 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e0c59cd-859f-45b6-be89-da265a062ceb-scripts\") pod \"5e0c59cd-859f-45b6-be89-da265a062ceb\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.910305 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-sys\") pod \"5e0c59cd-859f-45b6-be89-da265a062ceb\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.910353 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e0c59cd-859f-45b6-be89-da265a062ceb-logs\") pod \"5e0c59cd-859f-45b6-be89-da265a062ceb\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.910426 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"5e0c59cd-859f-45b6-be89-da265a062ceb\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.910458 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-etc-nvme\") pod \"5e0c59cd-859f-45b6-be89-da265a062ceb\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.910491 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pcm4\" (UniqueName: \"kubernetes.io/projected/5e0c59cd-859f-45b6-be89-da265a062ceb-kube-api-access-7pcm4\") pod \"5e0c59cd-859f-45b6-be89-da265a062ceb\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " Dec 03 12:51:21 crc 
kubenswrapper[4711]: I1203 12:51:21.910538 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-lib-modules\") pod \"5e0c59cd-859f-45b6-be89-da265a062ceb\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.910585 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-var-locks-brick\") pod \"5e0c59cd-859f-45b6-be89-da265a062ceb\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.910658 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-etc-iscsi\") pod \"5e0c59cd-859f-45b6-be89-da265a062ceb\" (UID: \"5e0c59cd-859f-45b6-be89-da265a062ceb\") " Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.912047 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "5e0c59cd-859f-45b6-be89-da265a062ceb" (UID: "5e0c59cd-859f-45b6-be89-da265a062ceb"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.912163 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-run" (OuterVolumeSpecName: "run") pod "5e0c59cd-859f-45b6-be89-da265a062ceb" (UID: "5e0c59cd-859f-45b6-be89-da265a062ceb"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.912214 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "5e0c59cd-859f-45b6-be89-da265a062ceb" (UID: "5e0c59cd-859f-45b6-be89-da265a062ceb"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.912293 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-dev" (OuterVolumeSpecName: "dev") pod "5e0c59cd-859f-45b6-be89-da265a062ceb" (UID: "5e0c59cd-859f-45b6-be89-da265a062ceb"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.912374 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-sys" (OuterVolumeSpecName: "sys") pod "5e0c59cd-859f-45b6-be89-da265a062ceb" (UID: "5e0c59cd-859f-45b6-be89-da265a062ceb"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.912418 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "5e0c59cd-859f-45b6-be89-da265a062ceb" (UID: "5e0c59cd-859f-45b6-be89-da265a062ceb"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.912459 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "5e0c59cd-859f-45b6-be89-da265a062ceb" (UID: "5e0c59cd-859f-45b6-be89-da265a062ceb"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.912501 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e0c59cd-859f-45b6-be89-da265a062ceb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5e0c59cd-859f-45b6-be89-da265a062ceb" (UID: "5e0c59cd-859f-45b6-be89-da265a062ceb"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.913070 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e0c59cd-859f-45b6-be89-da265a062ceb-logs" (OuterVolumeSpecName: "logs") pod "5e0c59cd-859f-45b6-be89-da265a062ceb" (UID: "5e0c59cd-859f-45b6-be89-da265a062ceb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.919069 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance-cache") pod "5e0c59cd-859f-45b6-be89-da265a062ceb" (UID: "5e0c59cd-859f-45b6-be89-da265a062ceb"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.919188 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e0c59cd-859f-45b6-be89-da265a062ceb-scripts" (OuterVolumeSpecName: "scripts") pod "5e0c59cd-859f-45b6-be89-da265a062ceb" (UID: "5e0c59cd-859f-45b6-be89-da265a062ceb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.919213 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e0c59cd-859f-45b6-be89-da265a062ceb-kube-api-access-7pcm4" (OuterVolumeSpecName: "kube-api-access-7pcm4") pod "5e0c59cd-859f-45b6-be89-da265a062ceb" (UID: "5e0c59cd-859f-45b6-be89-da265a062ceb"). InnerVolumeSpecName "kube-api-access-7pcm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.923084 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage17-crc" (OuterVolumeSpecName: "glance") pod "5e0c59cd-859f-45b6-be89-da265a062ceb" (UID: "5e0c59cd-859f-45b6-be89-da265a062ceb"). InnerVolumeSpecName "local-storage17-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:51:21 crc kubenswrapper[4711]: I1203 12:51:21.958289 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e0c59cd-859f-45b6-be89-da265a062ceb-config-data" (OuterVolumeSpecName: "config-data") pod "5e0c59cd-859f-45b6-be89-da265a062ceb" (UID: "5e0c59cd-859f-45b6-be89-da265a062ceb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.013303 4711 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.013363 4711 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.013381 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e0c59cd-859f-45b6-be89-da265a062ceb-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.013399 4711 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.013418 4711 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-dev\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.013434 4711 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e0c59cd-859f-45b6-be89-da265a062ceb-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.013485 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.013503 4711 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5e0c59cd-859f-45b6-be89-da265a062ceb-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.013521 4711 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e0c59cd-859f-45b6-be89-da265a062ceb-logs\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.013538 4711 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-sys\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.013565 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" " Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.013582 4711 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.013599 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pcm4\" (UniqueName: \"kubernetes.io/projected/5e0c59cd-859f-45b6-be89-da265a062ceb-kube-api-access-7pcm4\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.013618 4711 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5e0c59cd-859f-45b6-be89-da265a062ceb-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.044849 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage17-crc" (UniqueName: "kubernetes.io/local-volume/local-storage17-crc") on node "crc" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.045712 4711 operation_generator.go:917] UnmountDevice 
succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.115090 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.115121 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.549583 4711 generic.go:334] "Generic (PLEG): container finished" podID="5e0c59cd-859f-45b6-be89-da265a062ceb" containerID="e9cf7727ab7143ab0975c7e9cbc847518e7b19fcd9ee5915ce9cc5e85c42db2c" exitCode=143 Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.549863 4711 generic.go:334] "Generic (PLEG): container finished" podID="5e0c59cd-859f-45b6-be89-da265a062ceb" containerID="be9b8b1d7bc90bd4e874ec2ab971a302c112ac38ff334daaadf083c77478a040" exitCode=143 Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.549778 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"5e0c59cd-859f-45b6-be89-da265a062ceb","Type":"ContainerDied","Data":"e9cf7727ab7143ab0975c7e9cbc847518e7b19fcd9ee5915ce9cc5e85c42db2c"} Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.549979 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"5e0c59cd-859f-45b6-be89-da265a062ceb","Type":"ContainerDied","Data":"be9b8b1d7bc90bd4e874ec2ab971a302c112ac38ff334daaadf083c77478a040"} Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.550006 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" 
event={"ID":"5e0c59cd-859f-45b6-be89-da265a062ceb","Type":"ContainerDied","Data":"3f38ab0c8305ef95d025b58ff867e0d08ca6ddedeaa34d0b80fcd7255b82e34b"} Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.550035 4711 scope.go:117] "RemoveContainer" containerID="e9cf7727ab7143ab0975c7e9cbc847518e7b19fcd9ee5915ce9cc5e85c42db2c" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.549792 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.597177 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.598842 4711 scope.go:117] "RemoveContainer" containerID="be9b8b1d7bc90bd4e874ec2ab971a302c112ac38ff334daaadf083c77478a040" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.611303 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.628199 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 03 12:51:22 crc kubenswrapper[4711]: E1203 12:51:22.628543 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e0c59cd-859f-45b6-be89-da265a062ceb" containerName="glance-log" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.628564 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e0c59cd-859f-45b6-be89-da265a062ceb" containerName="glance-log" Dec 03 12:51:22 crc kubenswrapper[4711]: E1203 12:51:22.628592 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e0c59cd-859f-45b6-be89-da265a062ceb" containerName="glance-httpd" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.628602 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e0c59cd-859f-45b6-be89-da265a062ceb" 
containerName="glance-httpd" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.628779 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e0c59cd-859f-45b6-be89-da265a062ceb" containerName="glance-httpd" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.628815 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e0c59cd-859f-45b6-be89-da265a062ceb" containerName="glance-log" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.629901 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.638843 4711 scope.go:117] "RemoveContainer" containerID="e9cf7727ab7143ab0975c7e9cbc847518e7b19fcd9ee5915ce9cc5e85c42db2c" Dec 03 12:51:22 crc kubenswrapper[4711]: E1203 12:51:22.641414 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9cf7727ab7143ab0975c7e9cbc847518e7b19fcd9ee5915ce9cc5e85c42db2c\": container with ID starting with e9cf7727ab7143ab0975c7e9cbc847518e7b19fcd9ee5915ce9cc5e85c42db2c not found: ID does not exist" containerID="e9cf7727ab7143ab0975c7e9cbc847518e7b19fcd9ee5915ce9cc5e85c42db2c" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.641464 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9cf7727ab7143ab0975c7e9cbc847518e7b19fcd9ee5915ce9cc5e85c42db2c"} err="failed to get container status \"e9cf7727ab7143ab0975c7e9cbc847518e7b19fcd9ee5915ce9cc5e85c42db2c\": rpc error: code = NotFound desc = could not find container \"e9cf7727ab7143ab0975c7e9cbc847518e7b19fcd9ee5915ce9cc5e85c42db2c\": container with ID starting with e9cf7727ab7143ab0975c7e9cbc847518e7b19fcd9ee5915ce9cc5e85c42db2c not found: ID does not exist" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.641494 4711 scope.go:117] "RemoveContainer" 
containerID="be9b8b1d7bc90bd4e874ec2ab971a302c112ac38ff334daaadf083c77478a040" Dec 03 12:51:22 crc kubenswrapper[4711]: E1203 12:51:22.643014 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be9b8b1d7bc90bd4e874ec2ab971a302c112ac38ff334daaadf083c77478a040\": container with ID starting with be9b8b1d7bc90bd4e874ec2ab971a302c112ac38ff334daaadf083c77478a040 not found: ID does not exist" containerID="be9b8b1d7bc90bd4e874ec2ab971a302c112ac38ff334daaadf083c77478a040" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.643053 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be9b8b1d7bc90bd4e874ec2ab971a302c112ac38ff334daaadf083c77478a040"} err="failed to get container status \"be9b8b1d7bc90bd4e874ec2ab971a302c112ac38ff334daaadf083c77478a040\": rpc error: code = NotFound desc = could not find container \"be9b8b1d7bc90bd4e874ec2ab971a302c112ac38ff334daaadf083c77478a040\": container with ID starting with be9b8b1d7bc90bd4e874ec2ab971a302c112ac38ff334daaadf083c77478a040 not found: ID does not exist" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.643079 4711 scope.go:117] "RemoveContainer" containerID="e9cf7727ab7143ab0975c7e9cbc847518e7b19fcd9ee5915ce9cc5e85c42db2c" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.643349 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.643501 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9cf7727ab7143ab0975c7e9cbc847518e7b19fcd9ee5915ce9cc5e85c42db2c"} err="failed to get container status \"e9cf7727ab7143ab0975c7e9cbc847518e7b19fcd9ee5915ce9cc5e85c42db2c\": rpc error: code = NotFound desc = could not find container \"e9cf7727ab7143ab0975c7e9cbc847518e7b19fcd9ee5915ce9cc5e85c42db2c\": container with ID starting with 
e9cf7727ab7143ab0975c7e9cbc847518e7b19fcd9ee5915ce9cc5e85c42db2c not found: ID does not exist" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.643548 4711 scope.go:117] "RemoveContainer" containerID="be9b8b1d7bc90bd4e874ec2ab971a302c112ac38ff334daaadf083c77478a040" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.644159 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be9b8b1d7bc90bd4e874ec2ab971a302c112ac38ff334daaadf083c77478a040"} err="failed to get container status \"be9b8b1d7bc90bd4e874ec2ab971a302c112ac38ff334daaadf083c77478a040\": rpc error: code = NotFound desc = could not find container \"be9b8b1d7bc90bd4e874ec2ab971a302c112ac38ff334daaadf083c77478a040\": container with ID starting with be9b8b1d7bc90bd4e874ec2ab971a302c112ac38ff334daaadf083c77478a040 not found: ID does not exist" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.826406 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.826470 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.826498 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-internal-api-1\" (UID: 
\"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.826516 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2hmm\" (UniqueName: \"kubernetes.io/projected/2256b30d-1e4e-410f-8b81-6e813009b36d-kube-api-access-v2hmm\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.826543 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-sys\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.826566 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2256b30d-1e4e-410f-8b81-6e813009b36d-logs\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.826638 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2256b30d-1e4e-410f-8b81-6e813009b36d-scripts\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.826689 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2256b30d-1e4e-410f-8b81-6e813009b36d-httpd-run\") pod 
\"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.826760 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2256b30d-1e4e-410f-8b81-6e813009b36d-config-data\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.826840 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.826870 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.827072 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-dev\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.827157 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-run\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.827226 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.929197 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.929301 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.929349 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.929386 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.929410 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2hmm\" (UniqueName: \"kubernetes.io/projected/2256b30d-1e4e-410f-8b81-6e813009b36d-kube-api-access-v2hmm\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.929437 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-sys\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.929466 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2256b30d-1e4e-410f-8b81-6e813009b36d-logs\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.929490 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2256b30d-1e4e-410f-8b81-6e813009b36d-scripts\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.929517 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2256b30d-1e4e-410f-8b81-6e813009b36d-httpd-run\") 
pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.929551 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2256b30d-1e4e-410f-8b81-6e813009b36d-config-data\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.929581 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.929615 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.929672 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-dev\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.929705 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-run\") pod \"glance-default-internal-api-1\" (UID: 
\"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.929792 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-run\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.929836 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.930061 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") device mount path \"/mnt/openstack/pv02\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.930219 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.930329 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " 
pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.930377 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-dev\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.930637 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-sys\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.930752 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.930995 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") device mount path \"/mnt/openstack/pv17\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.931120 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2256b30d-1e4e-410f-8b81-6e813009b36d-logs\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc 
kubenswrapper[4711]: I1203 12:51:22.931198 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2256b30d-1e4e-410f-8b81-6e813009b36d-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.941502 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2256b30d-1e4e-410f-8b81-6e813009b36d-config-data\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.942175 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2256b30d-1e4e-410f-8b81-6e813009b36d-scripts\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.959088 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.960885 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2hmm\" (UniqueName: \"kubernetes.io/projected/2256b30d-1e4e-410f-8b81-6e813009b36d-kube-api-access-v2hmm\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:22 crc kubenswrapper[4711]: I1203 12:51:22.967519 4711 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-internal-api-1\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:23 crc kubenswrapper[4711]: I1203 12:51:23.252715 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:23 crc kubenswrapper[4711]: I1203 12:51:23.765727 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 03 12:51:23 crc kubenswrapper[4711]: I1203 12:51:23.826680 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e0c59cd-859f-45b6-be89-da265a062ceb" path="/var/lib/kubelet/pods/5e0c59cd-859f-45b6-be89-da265a062ceb/volumes" Dec 03 12:51:24 crc kubenswrapper[4711]: I1203 12:51:24.568183 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"2256b30d-1e4e-410f-8b81-6e813009b36d","Type":"ContainerStarted","Data":"8f33533d136c3929fa2d671bdfa9d7a0ce52d02a4351886f14fa63c26f77a954"} Dec 03 12:51:24 crc kubenswrapper[4711]: I1203 12:51:24.568734 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"2256b30d-1e4e-410f-8b81-6e813009b36d","Type":"ContainerStarted","Data":"b0ecd41e20e4650d351e85127f266cd54f5c082de6a97aea6cce50d32f2658ef"} Dec 03 12:51:24 crc kubenswrapper[4711]: I1203 12:51:24.568749 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"2256b30d-1e4e-410f-8b81-6e813009b36d","Type":"ContainerStarted","Data":"03d8eef9aa5b3766ee2458cdb00c7271e109f6ffe7638b3b41a77f35dedb6d95"} Dec 03 12:51:24 crc kubenswrapper[4711]: I1203 12:51:24.608596 4711 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="glance-kuttl-tests/glance-default-internal-api-1" podStartSLOduration=2.608567691 podStartE2EDuration="2.608567691s" podCreationTimestamp="2025-12-03 12:51:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:51:24.596182223 +0000 UTC m=+2203.265433518" watchObservedRunningTime="2025-12-03 12:51:24.608567691 +0000 UTC m=+2203.277818986" Dec 03 12:51:29 crc kubenswrapper[4711]: I1203 12:51:29.992211 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:29 crc kubenswrapper[4711]: I1203 12:51:29.993260 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:30 crc kubenswrapper[4711]: I1203 12:51:30.033826 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:30 crc kubenswrapper[4711]: I1203 12:51:30.053802 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:30 crc kubenswrapper[4711]: I1203 12:51:30.256285 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:30 crc kubenswrapper[4711]: I1203 12:51:30.256334 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:30 crc kubenswrapper[4711]: I1203 12:51:30.286998 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:30 crc kubenswrapper[4711]: I1203 12:51:30.318507 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" 
Dec 03 12:51:30 crc kubenswrapper[4711]: I1203 12:51:30.571944 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:30 crc kubenswrapper[4711]: I1203 12:51:30.572036 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:30 crc kubenswrapper[4711]: I1203 12:51:30.615094 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:30 crc kubenswrapper[4711]: I1203 12:51:30.627393 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:30 crc kubenswrapper[4711]: I1203 12:51:30.627455 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:30 crc kubenswrapper[4711]: I1203 12:51:30.627477 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:30 crc kubenswrapper[4711]: I1203 12:51:30.627495 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:30 crc kubenswrapper[4711]: I1203 12:51:30.627519 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:30 crc kubenswrapper[4711]: I1203 12:51:30.635886 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:31 crc kubenswrapper[4711]: I1203 12:51:31.637072 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:32 crc kubenswrapper[4711]: I1203 12:51:32.563181 4711 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:32 crc kubenswrapper[4711]: I1203 12:51:32.568594 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:51:32 crc kubenswrapper[4711]: I1203 12:51:32.644044 4711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 12:51:32 crc kubenswrapper[4711]: I1203 12:51:32.652271 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 03 12:51:32 crc kubenswrapper[4711]: I1203 12:51:32.652413 4711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 12:51:32 crc kubenswrapper[4711]: I1203 12:51:32.652425 4711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 12:51:32 crc kubenswrapper[4711]: I1203 12:51:32.652582 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="831817e3-651d-47a1-ae54-ddebe2dfc17b" containerName="glance-log" containerID="cri-o://82b95992e011935e47f24709e1b2d8c7f9f82404005dd0917e07203a91416dca" gracePeriod=30 Dec 03 12:51:32 crc kubenswrapper[4711]: I1203 12:51:32.653075 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="831817e3-651d-47a1-ae54-ddebe2dfc17b" containerName="glance-httpd" containerID="cri-o://d7e0067c41e3541b0cf6db6a58c6265429bcf519b06baef12ed838f0d1a76e0c" gracePeriod=30 Dec 03 12:51:32 crc kubenswrapper[4711]: I1203 12:51:32.669217 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:32 crc kubenswrapper[4711]: I1203 12:51:32.677119 4711 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-external-api-0" 
podUID="831817e3-651d-47a1-ae54-ddebe2dfc17b" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.0.145:9292/healthcheck\": EOF" Dec 03 12:51:32 crc kubenswrapper[4711]: I1203 12:51:32.689592 4711 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="831817e3-651d-47a1-ae54-ddebe2dfc17b" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.0.145:9292/healthcheck\": EOF" Dec 03 12:51:32 crc kubenswrapper[4711]: I1203 12:51:32.689625 4711 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="831817e3-651d-47a1-ae54-ddebe2dfc17b" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.0.145:9292/healthcheck\": EOF" Dec 03 12:51:33 crc kubenswrapper[4711]: I1203 12:51:33.253935 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:33 crc kubenswrapper[4711]: I1203 12:51:33.255223 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:33 crc kubenswrapper[4711]: I1203 12:51:33.283865 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:33 crc kubenswrapper[4711]: I1203 12:51:33.304407 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:33 crc kubenswrapper[4711]: I1203 12:51:33.569541 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:33 crc kubenswrapper[4711]: I1203 12:51:33.653991 4711 generic.go:334] "Generic (PLEG): container finished" podID="831817e3-651d-47a1-ae54-ddebe2dfc17b" 
containerID="82b95992e011935e47f24709e1b2d8c7f9f82404005dd0917e07203a91416dca" exitCode=143 Dec 03 12:51:33 crc kubenswrapper[4711]: I1203 12:51:33.654109 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"831817e3-651d-47a1-ae54-ddebe2dfc17b","Type":"ContainerDied","Data":"82b95992e011935e47f24709e1b2d8c7f9f82404005dd0917e07203a91416dca"} Dec 03 12:51:33 crc kubenswrapper[4711]: I1203 12:51:33.655686 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:33 crc kubenswrapper[4711]: I1203 12:51:33.655727 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:35 crc kubenswrapper[4711]: I1203 12:51:35.401647 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:51:35 crc kubenswrapper[4711]: I1203 12:51:35.402221 4711 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:51:35 crc kubenswrapper[4711]: I1203 12:51:35.567099 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:35 crc kubenswrapper[4711]: I1203 12:51:35.594607 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 03 12:51:35 crc kubenswrapper[4711]: I1203 12:51:35.636559 4711 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:51:35 crc kubenswrapper[4711]: I1203 12:51:35.668427 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="e4204740-8b38-44c3-b8cf-c8eb781c044a" containerName="glance-log" containerID="cri-o://703e0b660b8b733015591d0752f140232decc1202c8eff0be954ea72ab1acd0a" gracePeriod=30 Dec 03 12:51:35 crc kubenswrapper[4711]: I1203 12:51:35.669527 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="e4204740-8b38-44c3-b8cf-c8eb781c044a" containerName="glance-httpd" containerID="cri-o://1934c7377f2258bde7cb124b90a3e4bfb2c03b934e7d8a771a530a9042d7f7fd" gracePeriod=30 Dec 03 12:51:36 crc kubenswrapper[4711]: I1203 12:51:36.686079 4711 generic.go:334] "Generic (PLEG): container finished" podID="e4204740-8b38-44c3-b8cf-c8eb781c044a" containerID="703e0b660b8b733015591d0752f140232decc1202c8eff0be954ea72ab1acd0a" exitCode=143 Dec 03 12:51:36 crc kubenswrapper[4711]: I1203 12:51:36.686183 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e4204740-8b38-44c3-b8cf-c8eb781c044a","Type":"ContainerDied","Data":"703e0b660b8b733015591d0752f140232decc1202c8eff0be954ea72ab1acd0a"} Dec 03 12:51:38 crc kubenswrapper[4711]: I1203 12:51:38.080024 4711 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="831817e3-651d-47a1-ae54-ddebe2dfc17b" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.0.145:9292/healthcheck\": read tcp 10.217.0.2:38488->10.217.0.145:9292: read: connection reset by peer" Dec 03 12:51:39 crc kubenswrapper[4711]: I1203 12:51:39.716424 4711 generic.go:334] "Generic (PLEG): container finished" podID="831817e3-651d-47a1-ae54-ddebe2dfc17b" 
containerID="d7e0067c41e3541b0cf6db6a58c6265429bcf519b06baef12ed838f0d1a76e0c" exitCode=0 Dec 03 12:51:39 crc kubenswrapper[4711]: I1203 12:51:39.716496 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"831817e3-651d-47a1-ae54-ddebe2dfc17b","Type":"ContainerDied","Data":"d7e0067c41e3541b0cf6db6a58c6265429bcf519b06baef12ed838f0d1a76e0c"} Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.621499 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.727992 4711 generic.go:334] "Generic (PLEG): container finished" podID="e4204740-8b38-44c3-b8cf-c8eb781c044a" containerID="1934c7377f2258bde7cb124b90a3e4bfb2c03b934e7d8a771a530a9042d7f7fd" exitCode=0 Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.728043 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e4204740-8b38-44c3-b8cf-c8eb781c044a","Type":"ContainerDied","Data":"1934c7377f2258bde7cb124b90a3e4bfb2c03b934e7d8a771a530a9042d7f7fd"} Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.728085 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e4204740-8b38-44c3-b8cf-c8eb781c044a","Type":"ContainerDied","Data":"7ab2c02362f954492a0f67267f24509e3ca2f625499183df20d186b3e9a49ba4"} Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.728109 4711 scope.go:117] "RemoveContainer" containerID="1934c7377f2258bde7cb124b90a3e4bfb2c03b934e7d8a771a530a9042d7f7fd" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.728332 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.731165 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.734162 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"831817e3-651d-47a1-ae54-ddebe2dfc17b","Type":"ContainerDied","Data":"dc7394f3e5a1010e32bfdbedec4c461d689de633896a7a31d561274b7a5f2031"} Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.772281 4711 scope.go:117] "RemoveContainer" containerID="703e0b660b8b733015591d0752f140232decc1202c8eff0be954ea72ab1acd0a" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.788992 4711 scope.go:117] "RemoveContainer" containerID="1934c7377f2258bde7cb124b90a3e4bfb2c03b934e7d8a771a530a9042d7f7fd" Dec 03 12:51:40 crc kubenswrapper[4711]: E1203 12:51:40.789379 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1934c7377f2258bde7cb124b90a3e4bfb2c03b934e7d8a771a530a9042d7f7fd\": container with ID starting with 1934c7377f2258bde7cb124b90a3e4bfb2c03b934e7d8a771a530a9042d7f7fd not found: ID does not exist" containerID="1934c7377f2258bde7cb124b90a3e4bfb2c03b934e7d8a771a530a9042d7f7fd" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.789408 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1934c7377f2258bde7cb124b90a3e4bfb2c03b934e7d8a771a530a9042d7f7fd"} err="failed to get container status \"1934c7377f2258bde7cb124b90a3e4bfb2c03b934e7d8a771a530a9042d7f7fd\": rpc error: code = NotFound desc = could not find container \"1934c7377f2258bde7cb124b90a3e4bfb2c03b934e7d8a771a530a9042d7f7fd\": container with ID starting with 1934c7377f2258bde7cb124b90a3e4bfb2c03b934e7d8a771a530a9042d7f7fd not found: 
ID does not exist" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.789430 4711 scope.go:117] "RemoveContainer" containerID="703e0b660b8b733015591d0752f140232decc1202c8eff0be954ea72ab1acd0a" Dec 03 12:51:40 crc kubenswrapper[4711]: E1203 12:51:40.789732 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"703e0b660b8b733015591d0752f140232decc1202c8eff0be954ea72ab1acd0a\": container with ID starting with 703e0b660b8b733015591d0752f140232decc1202c8eff0be954ea72ab1acd0a not found: ID does not exist" containerID="703e0b660b8b733015591d0752f140232decc1202c8eff0be954ea72ab1acd0a" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.789748 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"703e0b660b8b733015591d0752f140232decc1202c8eff0be954ea72ab1acd0a"} err="failed to get container status \"703e0b660b8b733015591d0752f140232decc1202c8eff0be954ea72ab1acd0a\": rpc error: code = NotFound desc = could not find container \"703e0b660b8b733015591d0752f140232decc1202c8eff0be954ea72ab1acd0a\": container with ID starting with 703e0b660b8b733015591d0752f140232decc1202c8eff0be954ea72ab1acd0a not found: ID does not exist" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.789761 4711 scope.go:117] "RemoveContainer" containerID="d7e0067c41e3541b0cf6db6a58c6265429bcf519b06baef12ed838f0d1a76e0c" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.803564 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-sys\") pod \"e4204740-8b38-44c3-b8cf-c8eb781c044a\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.803606 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod 
\"e4204740-8b38-44c3-b8cf-c8eb781c044a\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.803626 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-etc-iscsi\") pod \"e4204740-8b38-44c3-b8cf-c8eb781c044a\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.803665 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-lib-modules\") pod \"e4204740-8b38-44c3-b8cf-c8eb781c044a\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.803690 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4204740-8b38-44c3-b8cf-c8eb781c044a-logs\") pod \"e4204740-8b38-44c3-b8cf-c8eb781c044a\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.803711 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-dev\") pod \"e4204740-8b38-44c3-b8cf-c8eb781c044a\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.803725 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-etc-nvme\") pod \"e4204740-8b38-44c3-b8cf-c8eb781c044a\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.803744 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/e4204740-8b38-44c3-b8cf-c8eb781c044a-httpd-run\") pod \"e4204740-8b38-44c3-b8cf-c8eb781c044a\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.803759 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-var-locks-brick\") pod \"e4204740-8b38-44c3-b8cf-c8eb781c044a\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.803792 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"e4204740-8b38-44c3-b8cf-c8eb781c044a\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.803823 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkrjb\" (UniqueName: \"kubernetes.io/projected/e4204740-8b38-44c3-b8cf-c8eb781c044a-kube-api-access-lkrjb\") pod \"e4204740-8b38-44c3-b8cf-c8eb781c044a\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.803865 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4204740-8b38-44c3-b8cf-c8eb781c044a-scripts\") pod \"e4204740-8b38-44c3-b8cf-c8eb781c044a\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.803880 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-run\") pod \"e4204740-8b38-44c3-b8cf-c8eb781c044a\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.803962 4711 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4204740-8b38-44c3-b8cf-c8eb781c044a-config-data\") pod \"e4204740-8b38-44c3-b8cf-c8eb781c044a\" (UID: \"e4204740-8b38-44c3-b8cf-c8eb781c044a\") " Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.804869 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "e4204740-8b38-44c3-b8cf-c8eb781c044a" (UID: "e4204740-8b38-44c3-b8cf-c8eb781c044a"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.804930 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-sys" (OuterVolumeSpecName: "sys") pod "e4204740-8b38-44c3-b8cf-c8eb781c044a" (UID: "e4204740-8b38-44c3-b8cf-c8eb781c044a"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.806123 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "e4204740-8b38-44c3-b8cf-c8eb781c044a" (UID: "e4204740-8b38-44c3-b8cf-c8eb781c044a"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.806190 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "e4204740-8b38-44c3-b8cf-c8eb781c044a" (UID: "e4204740-8b38-44c3-b8cf-c8eb781c044a"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.806842 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4204740-8b38-44c3-b8cf-c8eb781c044a-logs" (OuterVolumeSpecName: "logs") pod "e4204740-8b38-44c3-b8cf-c8eb781c044a" (UID: "e4204740-8b38-44c3-b8cf-c8eb781c044a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.806903 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-dev" (OuterVolumeSpecName: "dev") pod "e4204740-8b38-44c3-b8cf-c8eb781c044a" (UID: "e4204740-8b38-44c3-b8cf-c8eb781c044a"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.806978 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "e4204740-8b38-44c3-b8cf-c8eb781c044a" (UID: "e4204740-8b38-44c3-b8cf-c8eb781c044a"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.807188 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-run" (OuterVolumeSpecName: "run") pod "e4204740-8b38-44c3-b8cf-c8eb781c044a" (UID: "e4204740-8b38-44c3-b8cf-c8eb781c044a"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.807273 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4204740-8b38-44c3-b8cf-c8eb781c044a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e4204740-8b38-44c3-b8cf-c8eb781c044a" (UID: "e4204740-8b38-44c3-b8cf-c8eb781c044a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.809825 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance-cache") pod "e4204740-8b38-44c3-b8cf-c8eb781c044a" (UID: "e4204740-8b38-44c3-b8cf-c8eb781c044a"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.810653 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4204740-8b38-44c3-b8cf-c8eb781c044a-scripts" (OuterVolumeSpecName: "scripts") pod "e4204740-8b38-44c3-b8cf-c8eb781c044a" (UID: "e4204740-8b38-44c3-b8cf-c8eb781c044a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.811074 4711 scope.go:117] "RemoveContainer" containerID="82b95992e011935e47f24709e1b2d8c7f9f82404005dd0917e07203a91416dca" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.814065 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "e4204740-8b38-44c3-b8cf-c8eb781c044a" (UID: "e4204740-8b38-44c3-b8cf-c8eb781c044a"). InnerVolumeSpecName "local-storage14-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.816176 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4204740-8b38-44c3-b8cf-c8eb781c044a-kube-api-access-lkrjb" (OuterVolumeSpecName: "kube-api-access-lkrjb") pod "e4204740-8b38-44c3-b8cf-c8eb781c044a" (UID: "e4204740-8b38-44c3-b8cf-c8eb781c044a"). InnerVolumeSpecName "kube-api-access-lkrjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.847427 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4204740-8b38-44c3-b8cf-c8eb781c044a-config-data" (OuterVolumeSpecName: "config-data") pod "e4204740-8b38-44c3-b8cf-c8eb781c044a" (UID: "e4204740-8b38-44c3-b8cf-c8eb781c044a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.906170 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/831817e3-651d-47a1-ae54-ddebe2dfc17b-logs\") pod \"831817e3-651d-47a1-ae54-ddebe2dfc17b\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.906556 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-run\") pod \"831817e3-651d-47a1-ae54-ddebe2dfc17b\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.906616 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-lib-modules\") pod \"831817e3-651d-47a1-ae54-ddebe2dfc17b\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " Dec 03 12:51:40 crc 
kubenswrapper[4711]: I1203 12:51:40.906642 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-dev\") pod \"831817e3-651d-47a1-ae54-ddebe2dfc17b\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.906640 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-run" (OuterVolumeSpecName: "run") pod "831817e3-651d-47a1-ae54-ddebe2dfc17b" (UID: "831817e3-651d-47a1-ae54-ddebe2dfc17b"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.906670 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"831817e3-651d-47a1-ae54-ddebe2dfc17b\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.906670 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/831817e3-651d-47a1-ae54-ddebe2dfc17b-logs" (OuterVolumeSpecName: "logs") pod "831817e3-651d-47a1-ae54-ddebe2dfc17b" (UID: "831817e3-651d-47a1-ae54-ddebe2dfc17b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.906703 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-dev" (OuterVolumeSpecName: "dev") pod "831817e3-651d-47a1-ae54-ddebe2dfc17b" (UID: "831817e3-651d-47a1-ae54-ddebe2dfc17b"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.906734 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-var-locks-brick\") pod \"831817e3-651d-47a1-ae54-ddebe2dfc17b\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.906747 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "831817e3-651d-47a1-ae54-ddebe2dfc17b" (UID: "831817e3-651d-47a1-ae54-ddebe2dfc17b"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.906887 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "831817e3-651d-47a1-ae54-ddebe2dfc17b" (UID: "831817e3-651d-47a1-ae54-ddebe2dfc17b"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.907391 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-sys\") pod \"831817e3-651d-47a1-ae54-ddebe2dfc17b\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.907435 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-sys" (OuterVolumeSpecName: "sys") pod "831817e3-651d-47a1-ae54-ddebe2dfc17b" (UID: "831817e3-651d-47a1-ae54-ddebe2dfc17b"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.907590 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcr9w\" (UniqueName: \"kubernetes.io/projected/831817e3-651d-47a1-ae54-ddebe2dfc17b-kube-api-access-fcr9w\") pod \"831817e3-651d-47a1-ae54-ddebe2dfc17b\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.907639 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/831817e3-651d-47a1-ae54-ddebe2dfc17b-scripts\") pod \"831817e3-651d-47a1-ae54-ddebe2dfc17b\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.907692 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"831817e3-651d-47a1-ae54-ddebe2dfc17b\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.907733 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-etc-nvme\") pod \"831817e3-651d-47a1-ae54-ddebe2dfc17b\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.907772 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/831817e3-651d-47a1-ae54-ddebe2dfc17b-config-data\") pod \"831817e3-651d-47a1-ae54-ddebe2dfc17b\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.907809 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/831817e3-651d-47a1-ae54-ddebe2dfc17b-httpd-run\") pod \"831817e3-651d-47a1-ae54-ddebe2dfc17b\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.907843 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "831817e3-651d-47a1-ae54-ddebe2dfc17b" (UID: "831817e3-651d-47a1-ae54-ddebe2dfc17b"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.907856 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-etc-iscsi\") pod \"831817e3-651d-47a1-ae54-ddebe2dfc17b\" (UID: \"831817e3-651d-47a1-ae54-ddebe2dfc17b\") " Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.908046 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "831817e3-651d-47a1-ae54-ddebe2dfc17b" (UID: "831817e3-651d-47a1-ae54-ddebe2dfc17b"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.908077 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/831817e3-651d-47a1-ae54-ddebe2dfc17b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "831817e3-651d-47a1-ae54-ddebe2dfc17b" (UID: "831817e3-651d-47a1-ae54-ddebe2dfc17b"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.908559 4711 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4204740-8b38-44c3-b8cf-c8eb781c044a-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.908580 4711 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.908591 4711 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.908600 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4204740-8b38-44c3-b8cf-c8eb781c044a-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.908608 4711 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/831817e3-651d-47a1-ae54-ddebe2dfc17b-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.908617 4711 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.908625 4711 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/831817e3-651d-47a1-ae54-ddebe2dfc17b-logs\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.908633 4711 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-sys\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.908658 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.908668 4711 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.908677 4711 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.908685 4711 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.908693 4711 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-dev\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.908702 4711 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.908711 4711 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4204740-8b38-44c3-b8cf-c8eb781c044a-logs\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.908720 4711 reconciler_common.go:293] "Volume detached for volume 
\"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.908729 4711 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-dev\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.908739 4711 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/831817e3-651d-47a1-ae54-ddebe2dfc17b-sys\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.908747 4711 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.908754 4711 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4204740-8b38-44c3-b8cf-c8eb781c044a-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.908762 4711 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e4204740-8b38-44c3-b8cf-c8eb781c044a-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.908783 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.908793 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkrjb\" (UniqueName: \"kubernetes.io/projected/e4204740-8b38-44c3-b8cf-c8eb781c044a-kube-api-access-lkrjb\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:40 crc kubenswrapper[4711]: 
I1203 12:51:40.910078 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "831817e3-651d-47a1-ae54-ddebe2dfc17b" (UID: "831817e3-651d-47a1-ae54-ddebe2dfc17b"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.910411 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance-cache") pod "831817e3-651d-47a1-ae54-ddebe2dfc17b" (UID: "831817e3-651d-47a1-ae54-ddebe2dfc17b"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.910507 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/831817e3-651d-47a1-ae54-ddebe2dfc17b-kube-api-access-fcr9w" (OuterVolumeSpecName: "kube-api-access-fcr9w") pod "831817e3-651d-47a1-ae54-ddebe2dfc17b" (UID: "831817e3-651d-47a1-ae54-ddebe2dfc17b"). InnerVolumeSpecName "kube-api-access-fcr9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.911879 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/831817e3-651d-47a1-ae54-ddebe2dfc17b-scripts" (OuterVolumeSpecName: "scripts") pod "831817e3-651d-47a1-ae54-ddebe2dfc17b" (UID: "831817e3-651d-47a1-ae54-ddebe2dfc17b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.922189 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.923300 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 03 12:51:40 crc kubenswrapper[4711]: I1203 12:51:40.946091 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/831817e3-651d-47a1-ae54-ddebe2dfc17b-config-data" (OuterVolumeSpecName: "config-data") pod "831817e3-651d-47a1-ae54-ddebe2dfc17b" (UID: "831817e3-651d-47a1-ae54-ddebe2dfc17b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.010620 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/831817e3-651d-47a1-ae54-ddebe2dfc17b-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.010682 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.010738 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.010754 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 
12:51:41.010766 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcr9w\" (UniqueName: \"kubernetes.io/projected/831817e3-651d-47a1-ae54-ddebe2dfc17b-kube-api-access-fcr9w\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.010778 4711 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/831817e3-651d-47a1-ae54-ddebe2dfc17b-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.010823 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.027584 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.027656 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.063873 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.070310 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.100484 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:51:41 crc kubenswrapper[4711]: E1203 12:51:41.100802 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="831817e3-651d-47a1-ae54-ddebe2dfc17b" containerName="glance-httpd" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.100819 4711 
state_mem.go:107] "Deleted CPUSet assignment" podUID="831817e3-651d-47a1-ae54-ddebe2dfc17b" containerName="glance-httpd" Dec 03 12:51:41 crc kubenswrapper[4711]: E1203 12:51:41.100838 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4204740-8b38-44c3-b8cf-c8eb781c044a" containerName="glance-log" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.100844 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4204740-8b38-44c3-b8cf-c8eb781c044a" containerName="glance-log" Dec 03 12:51:41 crc kubenswrapper[4711]: E1203 12:51:41.100868 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="831817e3-651d-47a1-ae54-ddebe2dfc17b" containerName="glance-log" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.100874 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="831817e3-651d-47a1-ae54-ddebe2dfc17b" containerName="glance-log" Dec 03 12:51:41 crc kubenswrapper[4711]: E1203 12:51:41.100884 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4204740-8b38-44c3-b8cf-c8eb781c044a" containerName="glance-httpd" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.100889 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4204740-8b38-44c3-b8cf-c8eb781c044a" containerName="glance-httpd" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.101040 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="831817e3-651d-47a1-ae54-ddebe2dfc17b" containerName="glance-httpd" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.101052 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4204740-8b38-44c3-b8cf-c8eb781c044a" containerName="glance-httpd" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.101065 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4204740-8b38-44c3-b8cf-c8eb781c044a" containerName="glance-log" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.101076 4711 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="831817e3-651d-47a1-ae54-ddebe2dfc17b" containerName="glance-log" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.101817 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.111293 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.122199 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.122246 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.224004 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-dev\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.224081 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.224127 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.224148 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plnj7\" (UniqueName: \"kubernetes.io/projected/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-kube-api-access-plnj7\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.224184 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.224206 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-sys\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.224220 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-scripts\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.224245 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-logs\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.224262 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-run\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.224286 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.224320 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.224344 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.224363 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.224388 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-config-data\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.326693 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-sys\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.326793 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-scripts\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.326853 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-logs\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.326890 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-run\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.326998 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.327085 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.327131 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.327167 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.327221 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-config-data\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.327255 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-dev\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.327310 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.327380 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.327425 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plnj7\" (UniqueName: \"kubernetes.io/projected/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-kube-api-access-plnj7\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.327471 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.327511 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.326853 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-sys\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.327809 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-dev\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.327762 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.327888 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.327996 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") device mount path \"/mnt/openstack/pv14\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.328024 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.328079 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.328024 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-run\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.328401 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.328560 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-logs\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.331268 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-scripts\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.332843 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-config-data\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.357982 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plnj7\" (UniqueName: \"kubernetes.io/projected/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-kube-api-access-plnj7\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.368430 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.371760 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.419840 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.746979 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.842779 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4204740-8b38-44c3-b8cf-c8eb781c044a" path="/var/lib/kubelet/pods/e4204740-8b38-44c3-b8cf-c8eb781c044a/volumes" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.845958 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.863010 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.888980 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.890462 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.903958 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:51:41 crc kubenswrapper[4711]: I1203 12:51:41.924474 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.047825 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.047869 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcsm9\" (UniqueName: \"kubernetes.io/projected/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-kube-api-access-mcsm9\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.047899 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.047948 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: 
\"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.047968 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-logs\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.047991 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.048125 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-sys\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.048198 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-dev\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.048293 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.048319 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.048402 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.048422 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-scripts\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.048488 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-run\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.048545 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.150935 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-sys\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.150987 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-dev\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.151021 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-config-data\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.151058 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.151088 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-etc-iscsi\") pod 
\"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.151077 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-sys\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.151106 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-scripts\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.151151 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.151210 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-dev\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.151242 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-run\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.151599 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.151268 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-run\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.151378 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") device mount path \"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.151737 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.151893 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.155543 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.155664 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcsm9\" (UniqueName: \"kubernetes.io/projected/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-kube-api-access-mcsm9\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.155717 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.155777 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.155804 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-logs\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.155836 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.155882 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.156407 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-logs\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.156817 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-scripts\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.158090 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-config-data\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: 
I1203 12:51:42.165743 4711 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") device mount path \"/mnt/openstack/pv07\"" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.166112 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.172396 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcsm9\" (UniqueName: \"kubernetes.io/projected/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-kube-api-access-mcsm9\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.177326 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.205600 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.236121 4711 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.679751 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 03 12:51:42 crc kubenswrapper[4711]: W1203 12:51:42.683045 4711 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef1a870b_94d8_4c3b_a6ee_2f7ea748f37a.slice/crio-8764e9ccd990d8f718ff631420cc603b43830dd28eddc3351b3a0ad93af434ab WatchSource:0}: Error finding container 8764e9ccd990d8f718ff631420cc603b43830dd28eddc3351b3a0ad93af434ab: Status 404 returned error can't find the container with id 8764e9ccd990d8f718ff631420cc603b43830dd28eddc3351b3a0ad93af434ab Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.760619 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"94cf3b0f-ea64-4562-bf3c-95b2c4402c54","Type":"ContainerStarted","Data":"a19edfe490c0562914329016e9c3d76d4e232339015846700f7478a42a3bbce3"} Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.760662 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"94cf3b0f-ea64-4562-bf3c-95b2c4402c54","Type":"ContainerStarted","Data":"c10260793c20c22b78be35d1f9ed04542eaa5fa1fcc16b1fc920b1bd15c3a7be"} Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.760672 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"94cf3b0f-ea64-4562-bf3c-95b2c4402c54","Type":"ContainerStarted","Data":"ecb98526c96b2ed8a4e745b6083cb10c21ff78a4ca4c4fe53bac0feea555df7f"} Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.762151 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" 
event={"ID":"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a","Type":"ContainerStarted","Data":"8764e9ccd990d8f718ff631420cc603b43830dd28eddc3351b3a0ad93af434ab"} Dec 03 12:51:42 crc kubenswrapper[4711]: I1203 12:51:42.785743 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=1.785723447 podStartE2EDuration="1.785723447s" podCreationTimestamp="2025-12-03 12:51:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:51:42.782081778 +0000 UTC m=+2221.451333043" watchObservedRunningTime="2025-12-03 12:51:42.785723447 +0000 UTC m=+2221.454974702" Dec 03 12:51:43 crc kubenswrapper[4711]: I1203 12:51:43.770821 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a","Type":"ContainerStarted","Data":"5855c033bcca3b3d1012b8370d68a658e88455644f800e4158321558d1ff4e14"} Dec 03 12:51:43 crc kubenswrapper[4711]: I1203 12:51:43.771351 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a","Type":"ContainerStarted","Data":"99626dbe1e09ce1457eb7a93d22bad72e443d9d2e893b5ecbd41672cb89cc3c7"} Dec 03 12:51:43 crc kubenswrapper[4711]: I1203 12:51:43.794807 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.794789259 podStartE2EDuration="2.794789259s" podCreationTimestamp="2025-12-03 12:51:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:51:43.79410458 +0000 UTC m=+2222.463355855" watchObservedRunningTime="2025-12-03 12:51:43.794789259 +0000 UTC m=+2222.464040514" Dec 03 12:51:43 crc 
kubenswrapper[4711]: I1203 12:51:43.826316 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="831817e3-651d-47a1-ae54-ddebe2dfc17b" path="/var/lib/kubelet/pods/831817e3-651d-47a1-ae54-ddebe2dfc17b/volumes" Dec 03 12:51:51 crc kubenswrapper[4711]: I1203 12:51:51.420691 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:51 crc kubenswrapper[4711]: I1203 12:51:51.421431 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:51 crc kubenswrapper[4711]: I1203 12:51:51.468008 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:51 crc kubenswrapper[4711]: I1203 12:51:51.508043 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:51 crc kubenswrapper[4711]: I1203 12:51:51.852616 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:51 crc kubenswrapper[4711]: I1203 12:51:51.853291 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:52 crc kubenswrapper[4711]: I1203 12:51:52.237089 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:52 crc kubenswrapper[4711]: I1203 12:51:52.237158 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:52 crc kubenswrapper[4711]: I1203 12:51:52.284844 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:52 crc kubenswrapper[4711]: 
I1203 12:51:52.310115 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:52 crc kubenswrapper[4711]: I1203 12:51:52.860709 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:52 crc kubenswrapper[4711]: I1203 12:51:52.860749 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:53 crc kubenswrapper[4711]: I1203 12:51:53.869028 4711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 12:51:53 crc kubenswrapper[4711]: I1203 12:51:53.869067 4711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 12:51:53 crc kubenswrapper[4711]: I1203 12:51:53.950452 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:53 crc kubenswrapper[4711]: I1203 12:51:53.974799 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:51:54 crc kubenswrapper[4711]: I1203 12:51:54.777773 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:51:54 crc kubenswrapper[4711]: I1203 12:51:54.779453 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:52:05 crc kubenswrapper[4711]: I1203 12:52:05.401481 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:52:05 crc kubenswrapper[4711]: I1203 12:52:05.401993 4711 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:52:05 crc kubenswrapper[4711]: I1203 12:52:05.402038 4711 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" Dec 03 12:52:05 crc kubenswrapper[4711]: I1203 12:52:05.402636 4711 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"34c4c8649e6b4de50f304749dbfcdcd32a9ef18c76884ef8de700370ecd0a6e3"} pod="openshift-machine-config-operator/machine-config-daemon-52jgg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 12:52:05 crc kubenswrapper[4711]: I1203 12:52:05.402688 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" containerID="cri-o://34c4c8649e6b4de50f304749dbfcdcd32a9ef18c76884ef8de700370ecd0a6e3" gracePeriod=600 Dec 03 12:52:05 crc kubenswrapper[4711]: E1203 12:52:05.535065 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:52:05 crc kubenswrapper[4711]: I1203 12:52:05.972726 4711 generic.go:334] "Generic (PLEG): container finished" 
podID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerID="34c4c8649e6b4de50f304749dbfcdcd32a9ef18c76884ef8de700370ecd0a6e3" exitCode=0 Dec 03 12:52:05 crc kubenswrapper[4711]: I1203 12:52:05.972777 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" event={"ID":"776e7d35-d59b-4d4a-97cd-aec4f2441c1e","Type":"ContainerDied","Data":"34c4c8649e6b4de50f304749dbfcdcd32a9ef18c76884ef8de700370ecd0a6e3"} Dec 03 12:52:05 crc kubenswrapper[4711]: I1203 12:52:05.972813 4711 scope.go:117] "RemoveContainer" containerID="bf5aaef3ea300423cb3fae0894d5ba5f74a380ce9c593a8a024e4a722f79eb6b" Dec 03 12:52:05 crc kubenswrapper[4711]: I1203 12:52:05.974501 4711 scope.go:117] "RemoveContainer" containerID="34c4c8649e6b4de50f304749dbfcdcd32a9ef18c76884ef8de700370ecd0a6e3" Dec 03 12:52:05 crc kubenswrapper[4711]: E1203 12:52:05.975158 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:52:17 crc kubenswrapper[4711]: I1203 12:52:17.823651 4711 scope.go:117] "RemoveContainer" containerID="34c4c8649e6b4de50f304749dbfcdcd32a9ef18c76884ef8de700370ecd0a6e3" Dec 03 12:52:17 crc kubenswrapper[4711]: E1203 12:52:17.825297 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 
12:52:29 crc kubenswrapper[4711]: I1203 12:52:29.817495 4711 scope.go:117] "RemoveContainer" containerID="34c4c8649e6b4de50f304749dbfcdcd32a9ef18c76884ef8de700370ecd0a6e3" Dec 03 12:52:29 crc kubenswrapper[4711]: E1203 12:52:29.818461 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:52:37 crc kubenswrapper[4711]: I1203 12:52:37.114935 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 03 12:52:37 crc kubenswrapper[4711]: I1203 12:52:37.115809 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="75fbd2cd-656d-4547-bd68-2b679f418780" containerName="glance-log" containerID="cri-o://355f127ffd8448aceba29c3b05d09b59993e407776d176dd53889aba772c6a23" gracePeriod=30 Dec 03 12:52:37 crc kubenswrapper[4711]: I1203 12:52:37.116009 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="75fbd2cd-656d-4547-bd68-2b679f418780" containerName="glance-httpd" containerID="cri-o://bdcfcd5dde4ffac8e4b1493dbbbfd60a6e12d59e046542812db8f50d8cf65d06" gracePeriod=30 Dec 03 12:52:37 crc kubenswrapper[4711]: I1203 12:52:37.254004 4711 generic.go:334] "Generic (PLEG): container finished" podID="75fbd2cd-656d-4547-bd68-2b679f418780" containerID="355f127ffd8448aceba29c3b05d09b59993e407776d176dd53889aba772c6a23" exitCode=143 Dec 03 12:52:37 crc kubenswrapper[4711]: I1203 12:52:37.254043 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"75fbd2cd-656d-4547-bd68-2b679f418780","Type":"ContainerDied","Data":"355f127ffd8448aceba29c3b05d09b59993e407776d176dd53889aba772c6a23"} Dec 03 12:52:37 crc kubenswrapper[4711]: I1203 12:52:37.269865 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 03 12:52:37 crc kubenswrapper[4711]: I1203 12:52:37.270116 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="2256b30d-1e4e-410f-8b81-6e813009b36d" containerName="glance-log" containerID="cri-o://b0ecd41e20e4650d351e85127f266cd54f5c082de6a97aea6cce50d32f2658ef" gracePeriod=30 Dec 03 12:52:37 crc kubenswrapper[4711]: I1203 12:52:37.270249 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="2256b30d-1e4e-410f-8b81-6e813009b36d" containerName="glance-httpd" containerID="cri-o://8f33533d136c3929fa2d671bdfa9d7a0ce52d02a4351886f14fa63c26f77a954" gracePeriod=30 Dec 03 12:52:38 crc kubenswrapper[4711]: I1203 12:52:38.270447 4711 generic.go:334] "Generic (PLEG): container finished" podID="2256b30d-1e4e-410f-8b81-6e813009b36d" containerID="b0ecd41e20e4650d351e85127f266cd54f5c082de6a97aea6cce50d32f2658ef" exitCode=143 Dec 03 12:52:38 crc kubenswrapper[4711]: I1203 12:52:38.270551 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"2256b30d-1e4e-410f-8b81-6e813009b36d","Type":"ContainerDied","Data":"b0ecd41e20e4650d351e85127f266cd54f5c082de6a97aea6cce50d32f2658ef"} Dec 03 12:52:38 crc kubenswrapper[4711]: I1203 12:52:38.467029 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-dn7gg"] Dec 03 12:52:38 crc kubenswrapper[4711]: I1203 12:52:38.475373 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["glance-kuttl-tests/glance-db-sync-dn7gg"] Dec 03 12:52:38 crc kubenswrapper[4711]: I1203 12:52:38.509514 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance3235-account-delete-nrtm4"] Dec 03 12:52:38 crc kubenswrapper[4711]: I1203 12:52:38.510612 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance3235-account-delete-nrtm4" Dec 03 12:52:38 crc kubenswrapper[4711]: I1203 12:52:38.528648 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance3235-account-delete-nrtm4"] Dec 03 12:52:38 crc kubenswrapper[4711]: I1203 12:52:38.562406 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 03 12:52:38 crc kubenswrapper[4711]: I1203 12:52:38.562669 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a" containerName="glance-log" containerID="cri-o://99626dbe1e09ce1457eb7a93d22bad72e443d9d2e893b5ecbd41672cb89cc3c7" gracePeriod=30 Dec 03 12:52:38 crc kubenswrapper[4711]: I1203 12:52:38.562746 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a" containerName="glance-httpd" containerID="cri-o://5855c033bcca3b3d1012b8370d68a658e88455644f800e4158321558d1ff4e14" gracePeriod=30 Dec 03 12:52:38 crc kubenswrapper[4711]: I1203 12:52:38.620857 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlhgw\" (UniqueName: \"kubernetes.io/projected/b3cf8de0-7fe4-4754-9827-f8b793104551-kube-api-access-vlhgw\") pod \"glance3235-account-delete-nrtm4\" (UID: \"b3cf8de0-7fe4-4754-9827-f8b793104551\") " pod="glance-kuttl-tests/glance3235-account-delete-nrtm4" Dec 03 12:52:38 crc kubenswrapper[4711]: I1203 
12:52:38.621318 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3cf8de0-7fe4-4754-9827-f8b793104551-operator-scripts\") pod \"glance3235-account-delete-nrtm4\" (UID: \"b3cf8de0-7fe4-4754-9827-f8b793104551\") " pod="glance-kuttl-tests/glance3235-account-delete-nrtm4" Dec 03 12:52:38 crc kubenswrapper[4711]: I1203 12:52:38.661319 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:52:38 crc kubenswrapper[4711]: I1203 12:52:38.661736 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="94cf3b0f-ea64-4562-bf3c-95b2c4402c54" containerName="glance-log" containerID="cri-o://c10260793c20c22b78be35d1f9ed04542eaa5fa1fcc16b1fc920b1bd15c3a7be" gracePeriod=30 Dec 03 12:52:38 crc kubenswrapper[4711]: I1203 12:52:38.661831 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="94cf3b0f-ea64-4562-bf3c-95b2c4402c54" containerName="glance-httpd" containerID="cri-o://a19edfe490c0562914329016e9c3d76d4e232339015846700f7478a42a3bbce3" gracePeriod=30 Dec 03 12:52:38 crc kubenswrapper[4711]: I1203 12:52:38.722628 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlhgw\" (UniqueName: \"kubernetes.io/projected/b3cf8de0-7fe4-4754-9827-f8b793104551-kube-api-access-vlhgw\") pod \"glance3235-account-delete-nrtm4\" (UID: \"b3cf8de0-7fe4-4754-9827-f8b793104551\") " pod="glance-kuttl-tests/glance3235-account-delete-nrtm4" Dec 03 12:52:38 crc kubenswrapper[4711]: I1203 12:52:38.722781 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3cf8de0-7fe4-4754-9827-f8b793104551-operator-scripts\") pod 
\"glance3235-account-delete-nrtm4\" (UID: \"b3cf8de0-7fe4-4754-9827-f8b793104551\") " pod="glance-kuttl-tests/glance3235-account-delete-nrtm4" Dec 03 12:52:38 crc kubenswrapper[4711]: I1203 12:52:38.723495 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3cf8de0-7fe4-4754-9827-f8b793104551-operator-scripts\") pod \"glance3235-account-delete-nrtm4\" (UID: \"b3cf8de0-7fe4-4754-9827-f8b793104551\") " pod="glance-kuttl-tests/glance3235-account-delete-nrtm4" Dec 03 12:52:38 crc kubenswrapper[4711]: I1203 12:52:38.744544 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlhgw\" (UniqueName: \"kubernetes.io/projected/b3cf8de0-7fe4-4754-9827-f8b793104551-kube-api-access-vlhgw\") pod \"glance3235-account-delete-nrtm4\" (UID: \"b3cf8de0-7fe4-4754-9827-f8b793104551\") " pod="glance-kuttl-tests/glance3235-account-delete-nrtm4" Dec 03 12:52:38 crc kubenswrapper[4711]: I1203 12:52:38.833846 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance3235-account-delete-nrtm4" Dec 03 12:52:39 crc kubenswrapper[4711]: I1203 12:52:39.044746 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance3235-account-delete-nrtm4"] Dec 03 12:52:39 crc kubenswrapper[4711]: I1203 12:52:39.280653 4711 generic.go:334] "Generic (PLEG): container finished" podID="94cf3b0f-ea64-4562-bf3c-95b2c4402c54" containerID="c10260793c20c22b78be35d1f9ed04542eaa5fa1fcc16b1fc920b1bd15c3a7be" exitCode=143 Dec 03 12:52:39 crc kubenswrapper[4711]: I1203 12:52:39.280724 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"94cf3b0f-ea64-4562-bf3c-95b2c4402c54","Type":"ContainerDied","Data":"c10260793c20c22b78be35d1f9ed04542eaa5fa1fcc16b1fc920b1bd15c3a7be"} Dec 03 12:52:39 crc kubenswrapper[4711]: I1203 12:52:39.282009 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance3235-account-delete-nrtm4" event={"ID":"b3cf8de0-7fe4-4754-9827-f8b793104551","Type":"ContainerStarted","Data":"f7de3240d0d84284c868389b052511cc57e6fff6a6d4ff4144ecd47aca58b96b"} Dec 03 12:52:39 crc kubenswrapper[4711]: I1203 12:52:39.282026 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance3235-account-delete-nrtm4" event={"ID":"b3cf8de0-7fe4-4754-9827-f8b793104551","Type":"ContainerStarted","Data":"72ac2cbaa4e18ca9a9aa301cc1341780b6c4d85ee6f01b365aa193334bafdd49"} Dec 03 12:52:39 crc kubenswrapper[4711]: I1203 12:52:39.284509 4711 generic.go:334] "Generic (PLEG): container finished" podID="ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a" containerID="99626dbe1e09ce1457eb7a93d22bad72e443d9d2e893b5ecbd41672cb89cc3c7" exitCode=143 Dec 03 12:52:39 crc kubenswrapper[4711]: I1203 12:52:39.284538 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" 
event={"ID":"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a","Type":"ContainerDied","Data":"99626dbe1e09ce1457eb7a93d22bad72e443d9d2e893b5ecbd41672cb89cc3c7"} Dec 03 12:52:39 crc kubenswrapper[4711]: I1203 12:52:39.827776 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a532310-9c14-45b6-8d03-ae94aa7ecd77" path="/var/lib/kubelet/pods/8a532310-9c14-45b6-8d03-ae94aa7ecd77/volumes" Dec 03 12:52:40 crc kubenswrapper[4711]: I1203 12:52:40.300743 4711 generic.go:334] "Generic (PLEG): container finished" podID="b3cf8de0-7fe4-4754-9827-f8b793104551" containerID="f7de3240d0d84284c868389b052511cc57e6fff6a6d4ff4144ecd47aca58b96b" exitCode=0 Dec 03 12:52:40 crc kubenswrapper[4711]: I1203 12:52:40.300783 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance3235-account-delete-nrtm4" event={"ID":"b3cf8de0-7fe4-4754-9827-f8b793104551","Type":"ContainerDied","Data":"f7de3240d0d84284c868389b052511cc57e6fff6a6d4ff4144ecd47aca58b96b"} Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.313421 4711 generic.go:334] "Generic (PLEG): container finished" podID="2256b30d-1e4e-410f-8b81-6e813009b36d" containerID="8f33533d136c3929fa2d671bdfa9d7a0ce52d02a4351886f14fa63c26f77a954" exitCode=0 Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.313489 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"2256b30d-1e4e-410f-8b81-6e813009b36d","Type":"ContainerDied","Data":"8f33533d136c3929fa2d671bdfa9d7a0ce52d02a4351886f14fa63c26f77a954"} Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.316228 4711 generic.go:334] "Generic (PLEG): container finished" podID="75fbd2cd-656d-4547-bd68-2b679f418780" containerID="bdcfcd5dde4ffac8e4b1493dbbbfd60a6e12d59e046542812db8f50d8cf65d06" exitCode=0 Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.316365 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" 
event={"ID":"75fbd2cd-656d-4547-bd68-2b679f418780","Type":"ContainerDied","Data":"bdcfcd5dde4ffac8e4b1493dbbbfd60a6e12d59e046542812db8f50d8cf65d06"} Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.316449 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"75fbd2cd-656d-4547-bd68-2b679f418780","Type":"ContainerDied","Data":"293a39ddd131945d7813aa8f8bbd4156e5e053db87f1e93de0a3328868ab1b0d"} Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.316483 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="293a39ddd131945d7813aa8f8bbd4156e5e053db87f1e93de0a3328868ab1b0d" Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.341854 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.469716 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/75fbd2cd-656d-4547-bd68-2b679f418780-httpd-run\") pod \"75fbd2cd-656d-4547-bd68-2b679f418780\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.469826 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"75fbd2cd-656d-4547-bd68-2b679f418780\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.469882 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-sys\") pod \"75fbd2cd-656d-4547-bd68-2b679f418780\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.469983 4711 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-lib-modules\") pod \"75fbd2cd-656d-4547-bd68-2b679f418780\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.470028 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-dev\") pod \"75fbd2cd-656d-4547-bd68-2b679f418780\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.470063 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75fbd2cd-656d-4547-bd68-2b679f418780-config-data\") pod \"75fbd2cd-656d-4547-bd68-2b679f418780\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.470093 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"75fbd2cd-656d-4547-bd68-2b679f418780\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.470133 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-etc-nvme\") pod \"75fbd2cd-656d-4547-bd68-2b679f418780\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.470194 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-var-locks-brick\") pod \"75fbd2cd-656d-4547-bd68-2b679f418780\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 
12:52:41.470211 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75fbd2cd-656d-4547-bd68-2b679f418780-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "75fbd2cd-656d-4547-bd68-2b679f418780" (UID: "75fbd2cd-656d-4547-bd68-2b679f418780"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.470254 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75fbd2cd-656d-4547-bd68-2b679f418780-scripts\") pod \"75fbd2cd-656d-4547-bd68-2b679f418780\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.470273 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-dev" (OuterVolumeSpecName: "dev") pod "75fbd2cd-656d-4547-bd68-2b679f418780" (UID: "75fbd2cd-656d-4547-bd68-2b679f418780"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.470330 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75fbd2cd-656d-4547-bd68-2b679f418780-logs\") pod \"75fbd2cd-656d-4547-bd68-2b679f418780\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.470428 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-run\") pod \"75fbd2cd-656d-4547-bd68-2b679f418780\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.470462 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-etc-iscsi\") pod \"75fbd2cd-656d-4547-bd68-2b679f418780\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.470500 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zt2dz\" (UniqueName: \"kubernetes.io/projected/75fbd2cd-656d-4547-bd68-2b679f418780-kube-api-access-zt2dz\") pod \"75fbd2cd-656d-4547-bd68-2b679f418780\" (UID: \"75fbd2cd-656d-4547-bd68-2b679f418780\") " Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.470576 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "75fbd2cd-656d-4547-bd68-2b679f418780" (UID: "75fbd2cd-656d-4547-bd68-2b679f418780"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.471128 4711 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/75fbd2cd-656d-4547-bd68-2b679f418780-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.471161 4711 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-dev\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.471178 4711 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.471367 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "75fbd2cd-656d-4547-bd68-2b679f418780" (UID: "75fbd2cd-656d-4547-bd68-2b679f418780"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.471442 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-run" (OuterVolumeSpecName: "run") pod "75fbd2cd-656d-4547-bd68-2b679f418780" (UID: "75fbd2cd-656d-4547-bd68-2b679f418780"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.471468 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "75fbd2cd-656d-4547-bd68-2b679f418780" (UID: "75fbd2cd-656d-4547-bd68-2b679f418780"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.471491 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "75fbd2cd-656d-4547-bd68-2b679f418780" (UID: "75fbd2cd-656d-4547-bd68-2b679f418780"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.471508 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-sys" (OuterVolumeSpecName: "sys") pod "75fbd2cd-656d-4547-bd68-2b679f418780" (UID: "75fbd2cd-656d-4547-bd68-2b679f418780"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.471640 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75fbd2cd-656d-4547-bd68-2b679f418780-logs" (OuterVolumeSpecName: "logs") pod "75fbd2cd-656d-4547-bd68-2b679f418780" (UID: "75fbd2cd-656d-4547-bd68-2b679f418780"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.476611 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "75fbd2cd-656d-4547-bd68-2b679f418780" (UID: "75fbd2cd-656d-4547-bd68-2b679f418780"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.477267 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75fbd2cd-656d-4547-bd68-2b679f418780-scripts" (OuterVolumeSpecName: "scripts") pod "75fbd2cd-656d-4547-bd68-2b679f418780" (UID: "75fbd2cd-656d-4547-bd68-2b679f418780"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.477545 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75fbd2cd-656d-4547-bd68-2b679f418780-kube-api-access-zt2dz" (OuterVolumeSpecName: "kube-api-access-zt2dz") pod "75fbd2cd-656d-4547-bd68-2b679f418780" (UID: "75fbd2cd-656d-4547-bd68-2b679f418780"). InnerVolumeSpecName "kube-api-access-zt2dz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.478205 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance-cache") pod "75fbd2cd-656d-4547-bd68-2b679f418780" (UID: "75fbd2cd-656d-4547-bd68-2b679f418780"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.519455 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75fbd2cd-656d-4547-bd68-2b679f418780-config-data" (OuterVolumeSpecName: "config-data") pod "75fbd2cd-656d-4547-bd68-2b679f418780" (UID: "75fbd2cd-656d-4547-bd68-2b679f418780"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.572612 4711 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-run\") on node \"crc\" DevicePath \"\""
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.572655 4711 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-etc-iscsi\") on node \"crc\" DevicePath \"\""
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.572688 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zt2dz\" (UniqueName: \"kubernetes.io/projected/75fbd2cd-656d-4547-bd68-2b679f418780-kube-api-access-zt2dz\") on node \"crc\" DevicePath \"\""
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.572726 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" "
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.572736 4711 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-sys\") on node \"crc\" DevicePath \"\""
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.572746 4711 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-lib-modules\") on node \"crc\" DevicePath \"\""
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.572796 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75fbd2cd-656d-4547-bd68-2b679f418780-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.572825 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" "
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.572854 4711 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/75fbd2cd-656d-4547-bd68-2b679f418780-etc-nvme\") on node \"crc\" DevicePath \"\""
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.572864 4711 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75fbd2cd-656d-4547-bd68-2b679f418780-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.572875 4711 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75fbd2cd-656d-4547-bd68-2b679f418780-logs\") on node \"crc\" DevicePath \"\""
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.583865 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance3235-account-delete-nrtm4"
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.587657 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc"
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.592571 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc"
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.673714 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlhgw\" (UniqueName: \"kubernetes.io/projected/b3cf8de0-7fe4-4754-9827-f8b793104551-kube-api-access-vlhgw\") pod \"b3cf8de0-7fe4-4754-9827-f8b793104551\" (UID: \"b3cf8de0-7fe4-4754-9827-f8b793104551\") "
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.673901 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3cf8de0-7fe4-4754-9827-f8b793104551-operator-scripts\") pod \"b3cf8de0-7fe4-4754-9827-f8b793104551\" (UID: \"b3cf8de0-7fe4-4754-9827-f8b793104551\") "
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.674265 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\""
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.674288 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\""
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.674352 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3cf8de0-7fe4-4754-9827-f8b793104551-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b3cf8de0-7fe4-4754-9827-f8b793104551" (UID: "b3cf8de0-7fe4-4754-9827-f8b793104551"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.683097 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3cf8de0-7fe4-4754-9827-f8b793104551-kube-api-access-vlhgw" (OuterVolumeSpecName: "kube-api-access-vlhgw") pod "b3cf8de0-7fe4-4754-9827-f8b793104551" (UID: "b3cf8de0-7fe4-4754-9827-f8b793104551"). InnerVolumeSpecName "kube-api-access-vlhgw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.775740 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlhgw\" (UniqueName: \"kubernetes.io/projected/b3cf8de0-7fe4-4754-9827-f8b793104551-kube-api-access-vlhgw\") on node \"crc\" DevicePath \"\""
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.775798 4711 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3cf8de0-7fe4-4754-9827-f8b793104551-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.815568 4711 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="94cf3b0f-ea64-4562-bf3c-95b2c4402c54" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.0.149:9292/healthcheck\": read tcp 10.217.0.2:53428->10.217.0.149:9292: read: connection reset by peer"
Dec 03 12:52:41 crc kubenswrapper[4711]: I1203 12:52:41.815587 4711 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="94cf3b0f-ea64-4562-bf3c-95b2c4402c54" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.0.149:9292/healthcheck\": read tcp 10.217.0.2:53442->10.217.0.149:9292: read: connection reset by peer"
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.238827 4711 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.0.150:9292/healthcheck\": dial tcp 10.217.0.150:9292: connect: connection refused"
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.240092 4711 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.0.150:9292/healthcheck\": dial tcp 10.217.0.150:9292: connect: connection refused"
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.249865 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.325940 4711 generic.go:334] "Generic (PLEG): container finished" podID="ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a" containerID="5855c033bcca3b3d1012b8370d68a658e88455644f800e4158321558d1ff4e14" exitCode=0
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.326006 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a","Type":"ContainerDied","Data":"5855c033bcca3b3d1012b8370d68a658e88455644f800e4158321558d1ff4e14"}
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.327810 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"2256b30d-1e4e-410f-8b81-6e813009b36d","Type":"ContainerDied","Data":"03d8eef9aa5b3766ee2458cdb00c7271e109f6ffe7638b3b41a77f35dedb6d95"}
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.327838 4711 scope.go:117] "RemoveContainer" containerID="8f33533d136c3929fa2d671bdfa9d7a0ce52d02a4351886f14fa63c26f77a954"
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.328044 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.333157 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1"
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.333174 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance3235-account-delete-nrtm4"
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.333145 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance3235-account-delete-nrtm4" event={"ID":"b3cf8de0-7fe4-4754-9827-f8b793104551","Type":"ContainerDied","Data":"72ac2cbaa4e18ca9a9aa301cc1341780b6c4d85ee6f01b365aa193334bafdd49"}
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.333291 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72ac2cbaa4e18ca9a9aa301cc1341780b6c4d85ee6f01b365aa193334bafdd49"
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.368813 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"]
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.371255 4711 scope.go:117] "RemoveContainer" containerID="b0ecd41e20e4650d351e85127f266cd54f5c082de6a97aea6cce50d32f2658ef"
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.377378 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"]
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.386522 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2hmm\" (UniqueName: \"kubernetes.io/projected/2256b30d-1e4e-410f-8b81-6e813009b36d-kube-api-access-v2hmm\") pod \"2256b30d-1e4e-410f-8b81-6e813009b36d\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") "
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.386595 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-lib-modules\") pod \"2256b30d-1e4e-410f-8b81-6e813009b36d\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") "
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.386659 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"2256b30d-1e4e-410f-8b81-6e813009b36d\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") "
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.386681 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "2256b30d-1e4e-410f-8b81-6e813009b36d" (UID: "2256b30d-1e4e-410f-8b81-6e813009b36d"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.386709 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2256b30d-1e4e-410f-8b81-6e813009b36d-scripts\") pod \"2256b30d-1e4e-410f-8b81-6e813009b36d\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") "
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.386741 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-etc-nvme\") pod \"2256b30d-1e4e-410f-8b81-6e813009b36d\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") "
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.386788 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"2256b30d-1e4e-410f-8b81-6e813009b36d\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") "
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.386827 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2256b30d-1e4e-410f-8b81-6e813009b36d-config-data\") pod \"2256b30d-1e4e-410f-8b81-6e813009b36d\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") "
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.386858 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-var-locks-brick\") pod \"2256b30d-1e4e-410f-8b81-6e813009b36d\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") "
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.386889 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-etc-iscsi\") pod \"2256b30d-1e4e-410f-8b81-6e813009b36d\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") "
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.386931 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-run\") pod \"2256b30d-1e4e-410f-8b81-6e813009b36d\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") "
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.386977 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-dev\") pod \"2256b30d-1e4e-410f-8b81-6e813009b36d\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") "
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.387023 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2256b30d-1e4e-410f-8b81-6e813009b36d-httpd-run\") pod \"2256b30d-1e4e-410f-8b81-6e813009b36d\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") "
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.387051 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-sys\") pod \"2256b30d-1e4e-410f-8b81-6e813009b36d\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") "
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.387076 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2256b30d-1e4e-410f-8b81-6e813009b36d-logs\") pod \"2256b30d-1e4e-410f-8b81-6e813009b36d\" (UID: \"2256b30d-1e4e-410f-8b81-6e813009b36d\") "
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.387314 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "2256b30d-1e4e-410f-8b81-6e813009b36d" (UID: "2256b30d-1e4e-410f-8b81-6e813009b36d"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.387368 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-dev" (OuterVolumeSpecName: "dev") pod "2256b30d-1e4e-410f-8b81-6e813009b36d" (UID: "2256b30d-1e4e-410f-8b81-6e813009b36d"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.387416 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "2256b30d-1e4e-410f-8b81-6e813009b36d" (UID: "2256b30d-1e4e-410f-8b81-6e813009b36d"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.387740 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2256b30d-1e4e-410f-8b81-6e813009b36d-logs" (OuterVolumeSpecName: "logs") pod "2256b30d-1e4e-410f-8b81-6e813009b36d" (UID: "2256b30d-1e4e-410f-8b81-6e813009b36d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.387788 4711 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-lib-modules\") on node \"crc\" DevicePath \"\""
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.387812 4711 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-etc-nvme\") on node \"crc\" DevicePath \"\""
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.387830 4711 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-etc-iscsi\") on node \"crc\" DevicePath \"\""
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.387847 4711 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-dev\") on node \"crc\" DevicePath \"\""
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.387940 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-run" (OuterVolumeSpecName: "run") pod "2256b30d-1e4e-410f-8b81-6e813009b36d" (UID: "2256b30d-1e4e-410f-8b81-6e813009b36d"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.388045 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2256b30d-1e4e-410f-8b81-6e813009b36d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2256b30d-1e4e-410f-8b81-6e813009b36d" (UID: "2256b30d-1e4e-410f-8b81-6e813009b36d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.388090 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-sys" (OuterVolumeSpecName: "sys") pod "2256b30d-1e4e-410f-8b81-6e813009b36d" (UID: "2256b30d-1e4e-410f-8b81-6e813009b36d"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.388118 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "2256b30d-1e4e-410f-8b81-6e813009b36d" (UID: "2256b30d-1e4e-410f-8b81-6e813009b36d"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.390149 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage17-crc" (OuterVolumeSpecName: "glance") pod "2256b30d-1e4e-410f-8b81-6e813009b36d" (UID: "2256b30d-1e4e-410f-8b81-6e813009b36d"). InnerVolumeSpecName "local-storage17-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.390375 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2256b30d-1e4e-410f-8b81-6e813009b36d-kube-api-access-v2hmm" (OuterVolumeSpecName: "kube-api-access-v2hmm") pod "2256b30d-1e4e-410f-8b81-6e813009b36d" (UID: "2256b30d-1e4e-410f-8b81-6e813009b36d"). InnerVolumeSpecName "kube-api-access-v2hmm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.391046 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2256b30d-1e4e-410f-8b81-6e813009b36d-scripts" (OuterVolumeSpecName: "scripts") pod "2256b30d-1e4e-410f-8b81-6e813009b36d" (UID: "2256b30d-1e4e-410f-8b81-6e813009b36d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.392550 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance-cache") pod "2256b30d-1e4e-410f-8b81-6e813009b36d" (UID: "2256b30d-1e4e-410f-8b81-6e813009b36d"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.428393 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2256b30d-1e4e-410f-8b81-6e813009b36d-config-data" (OuterVolumeSpecName: "config-data") pod "2256b30d-1e4e-410f-8b81-6e813009b36d" (UID: "2256b30d-1e4e-410f-8b81-6e813009b36d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.490792 4711 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2256b30d-1e4e-410f-8b81-6e813009b36d-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.491102 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" "
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.491135 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2256b30d-1e4e-410f-8b81-6e813009b36d-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.491160 4711 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-var-locks-brick\") on node \"crc\" DevicePath \"\""
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.491186 4711 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-run\") on node \"crc\" DevicePath \"\""
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.491210 4711 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2256b30d-1e4e-410f-8b81-6e813009b36d-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.491233 4711 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2256b30d-1e4e-410f-8b81-6e813009b36d-sys\") on node \"crc\" DevicePath \"\""
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.491256 4711 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2256b30d-1e4e-410f-8b81-6e813009b36d-logs\") on node \"crc\" DevicePath \"\""
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.491280 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2hmm\" (UniqueName: \"kubernetes.io/projected/2256b30d-1e4e-410f-8b81-6e813009b36d-kube-api-access-v2hmm\") on node \"crc\" DevicePath \"\""
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.491335 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" "
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.512544 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage17-crc" (UniqueName: "kubernetes.io/local-volume/local-storage17-crc") on node "crc"
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.520521 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.594620 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.594669 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" DevicePath \"\""
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.659893 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"]
Dec 03 12:52:42 crc kubenswrapper[4711]: I1203 12:52:42.664786 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"]
Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.037941 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.104265 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-logs\") pod \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") "
Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.104370 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcsm9\" (UniqueName: \"kubernetes.io/projected/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-kube-api-access-mcsm9\") pod \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") "
Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.104431 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-lib-modules\") pod \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") "
Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.104470 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-sys\") pod \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") "
Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.104488 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") "
Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.104512 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-var-locks-brick\") pod \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") "
Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.104552 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-etc-iscsi\") pod \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") "
Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.104604 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-run\") pod \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") "
Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.104627 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-config-data\") pod \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") "
Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.104652 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-dev\") pod \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") "
Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.104685 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-etc-nvme\") pod \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") "
Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.104717 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-scripts\") pod \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") "
Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.104748 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") "
Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.104808 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-httpd-run\") pod \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\" (UID: \"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a\") "
Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.105405 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a" (UID: "ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.105764 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-logs" (OuterVolumeSpecName: "logs") pod "ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a" (UID: "ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.106544 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-run" (OuterVolumeSpecName: "run") pod "ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a" (UID: "ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.106586 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a" (UID: "ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.106610 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a" (UID: "ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.106633 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a" (UID: "ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.106663 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a" (UID: "ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.106762 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-sys" (OuterVolumeSpecName: "sys") pod "ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a" (UID: "ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.106765 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-dev" (OuterVolumeSpecName: "dev") pod "ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a" (UID: "ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.111439 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a" (UID: "ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a"). InnerVolumeSpecName "local-storage07-crc".
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.111584 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance-cache") pod "ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a" (UID: "ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.113125 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-scripts" (OuterVolumeSpecName: "scripts") pod "ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a" (UID: "ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.126071 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-kube-api-access-mcsm9" (OuterVolumeSpecName: "kube-api-access-mcsm9") pod "ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a" (UID: "ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a"). InnerVolumeSpecName "kube-api-access-mcsm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.170786 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-config-data" (OuterVolumeSpecName: "config-data") pod "ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a" (UID: "ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.206165 4711 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.206201 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.206230 4711 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-dev\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.206240 4711 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.206251 4711 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.206294 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.206305 4711 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.206314 4711 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-logs\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.206322 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcsm9\" (UniqueName: \"kubernetes.io/projected/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-kube-api-access-mcsm9\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.206331 4711 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.206341 4711 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-sys\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.206355 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.206366 4711 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.206376 4711 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.221266 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.224536 4711 
operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.228967 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.312153 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-etc-nvme\") pod \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.312508 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "94cf3b0f-ea64-4562-bf3c-95b2c4402c54" (UID: "94cf3b0f-ea64-4562-bf3c-95b2c4402c54"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.312538 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-etc-iscsi\") pod \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.312595 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "94cf3b0f-ea64-4562-bf3c-95b2c4402c54" (UID: "94cf3b0f-ea64-4562-bf3c-95b2c4402c54"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.312605 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.312639 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-var-locks-brick\") pod \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.312687 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-scripts\") pod \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.312711 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-lib-modules\") pod \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.312756 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-logs\") pod \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.312790 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-dev\") pod 
\"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.312840 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-run\") pod \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.312860 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plnj7\" (UniqueName: \"kubernetes.io/projected/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-kube-api-access-plnj7\") pod \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.312890 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-sys\") pod \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.312948 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "94cf3b0f-ea64-4562-bf3c-95b2c4402c54" (UID: "94cf3b0f-ea64-4562-bf3c-95b2c4402c54"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.312964 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-httpd-run\") pod \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.312983 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "94cf3b0f-ea64-4562-bf3c-95b2c4402c54" (UID: "94cf3b0f-ea64-4562-bf3c-95b2c4402c54"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.312986 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.313008 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-config-data\") pod \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\" (UID: \"94cf3b0f-ea64-4562-bf3c-95b2c4402c54\") " Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.313309 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-sys" (OuterVolumeSpecName: "sys") pod "94cf3b0f-ea64-4562-bf3c-95b2c4402c54" (UID: "94cf3b0f-ea64-4562-bf3c-95b2c4402c54"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.313349 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-dev" (OuterVolumeSpecName: "dev") pod "94cf3b0f-ea64-4562-bf3c-95b2c4402c54" (UID: "94cf3b0f-ea64-4562-bf3c-95b2c4402c54"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.313382 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-run" (OuterVolumeSpecName: "run") pod "94cf3b0f-ea64-4562-bf3c-95b2c4402c54" (UID: "94cf3b0f-ea64-4562-bf3c-95b2c4402c54"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.313498 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "94cf3b0f-ea64-4562-bf3c-95b2c4402c54" (UID: "94cf3b0f-ea64-4562-bf3c-95b2c4402c54"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.313575 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-logs" (OuterVolumeSpecName: "logs") pod "94cf3b0f-ea64-4562-bf3c-95b2c4402c54" (UID: "94cf3b0f-ea64-4562-bf3c-95b2c4402c54"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.313651 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.313668 4711 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-logs\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.313677 4711 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-dev\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.313685 4711 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.313693 4711 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-sys\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.313701 4711 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.313711 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.313718 4711 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.313726 4711 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.313734 4711 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.313743 4711 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.315370 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "94cf3b0f-ea64-4562-bf3c-95b2c4402c54" (UID: "94cf3b0f-ea64-4562-bf3c-95b2c4402c54"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.315739 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-scripts" (OuterVolumeSpecName: "scripts") pod "94cf3b0f-ea64-4562-bf3c-95b2c4402c54" (UID: "94cf3b0f-ea64-4562-bf3c-95b2c4402c54"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.316027 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance-cache") pod "94cf3b0f-ea64-4562-bf3c-95b2c4402c54" (UID: "94cf3b0f-ea64-4562-bf3c-95b2c4402c54"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.316236 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-kube-api-access-plnj7" (OuterVolumeSpecName: "kube-api-access-plnj7") pod "94cf3b0f-ea64-4562-bf3c-95b2c4402c54" (UID: "94cf3b0f-ea64-4562-bf3c-95b2c4402c54"). InnerVolumeSpecName "kube-api-access-plnj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.346581 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-config-data" (OuterVolumeSpecName: "config-data") pod "94cf3b0f-ea64-4562-bf3c-95b2c4402c54" (UID: "94cf3b0f-ea64-4562-bf3c-95b2c4402c54"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.347764 4711 generic.go:334] "Generic (PLEG): container finished" podID="94cf3b0f-ea64-4562-bf3c-95b2c4402c54" containerID="a19edfe490c0562914329016e9c3d76d4e232339015846700f7478a42a3bbce3" exitCode=0 Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.347839 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"94cf3b0f-ea64-4562-bf3c-95b2c4402c54","Type":"ContainerDied","Data":"a19edfe490c0562914329016e9c3d76d4e232339015846700f7478a42a3bbce3"} Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.347849 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.347871 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"94cf3b0f-ea64-4562-bf3c-95b2c4402c54","Type":"ContainerDied","Data":"ecb98526c96b2ed8a4e745b6083cb10c21ff78a4ca4c4fe53bac0feea555df7f"} Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.347891 4711 scope.go:117] "RemoveContainer" containerID="a19edfe490c0562914329016e9c3d76d4e232339015846700f7478a42a3bbce3" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.349814 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a","Type":"ContainerDied","Data":"8764e9ccd990d8f718ff631420cc603b43830dd28eddc3351b3a0ad93af434ab"} Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.349924 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.417833 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.417878 4711 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.418027 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plnj7\" (UniqueName: \"kubernetes.io/projected/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-kube-api-access-plnj7\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.418049 4711 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.418081 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94cf3b0f-ea64-4562-bf3c-95b2c4402c54-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.422421 4711 scope.go:117] "RemoveContainer" containerID="c10260793c20c22b78be35d1f9ed04542eaa5fa1fcc16b1fc920b1bd15c3a7be" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.429573 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.449822 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Dec 03 12:52:43 crc 
kubenswrapper[4711]: I1203 12:52:43.452718 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.457624 4711 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.463923 4711 scope.go:117] "RemoveContainer" containerID="a19edfe490c0562914329016e9c3d76d4e232339015846700f7478a42a3bbce3" Dec 03 12:52:43 crc kubenswrapper[4711]: E1203 12:52:43.464387 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a19edfe490c0562914329016e9c3d76d4e232339015846700f7478a42a3bbce3\": container with ID starting with a19edfe490c0562914329016e9c3d76d4e232339015846700f7478a42a3bbce3 not found: ID does not exist" containerID="a19edfe490c0562914329016e9c3d76d4e232339015846700f7478a42a3bbce3" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.464421 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a19edfe490c0562914329016e9c3d76d4e232339015846700f7478a42a3bbce3"} err="failed to get container status \"a19edfe490c0562914329016e9c3d76d4e232339015846700f7478a42a3bbce3\": rpc error: code = NotFound desc = could not find container \"a19edfe490c0562914329016e9c3d76d4e232339015846700f7478a42a3bbce3\": container with ID starting with a19edfe490c0562914329016e9c3d76d4e232339015846700f7478a42a3bbce3 not found: ID does not exist" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.464449 4711 scope.go:117] "RemoveContainer" containerID="c10260793c20c22b78be35d1f9ed04542eaa5fa1fcc16b1fc920b1bd15c3a7be" Dec 03 12:52:43 crc kubenswrapper[4711]: E1203 12:52:43.464981 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c10260793c20c22b78be35d1f9ed04542eaa5fa1fcc16b1fc920b1bd15c3a7be\": container with ID starting with c10260793c20c22b78be35d1f9ed04542eaa5fa1fcc16b1fc920b1bd15c3a7be not found: ID does not exist" containerID="c10260793c20c22b78be35d1f9ed04542eaa5fa1fcc16b1fc920b1bd15c3a7be" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.465007 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c10260793c20c22b78be35d1f9ed04542eaa5fa1fcc16b1fc920b1bd15c3a7be"} err="failed to get container status \"c10260793c20c22b78be35d1f9ed04542eaa5fa1fcc16b1fc920b1bd15c3a7be\": rpc error: code = NotFound desc = could not find container \"c10260793c20c22b78be35d1f9ed04542eaa5fa1fcc16b1fc920b1bd15c3a7be\": container with ID starting with c10260793c20c22b78be35d1f9ed04542eaa5fa1fcc16b1fc920b1bd15c3a7be not found: ID does not exist" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.465027 4711 scope.go:117] "RemoveContainer" containerID="5855c033bcca3b3d1012b8370d68a658e88455644f800e4158321558d1ff4e14" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.470204 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.477099 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.486413 4711 scope.go:117] "RemoveContainer" containerID="99626dbe1e09ce1457eb7a93d22bad72e443d9d2e893b5ecbd41672cb89cc3c7" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.522776 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.522818 4711 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.527061 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-k4hnl"] Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.536140 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-k4hnl"] Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.540634 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance3235-account-delete-nrtm4"] Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.545473 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-3235-account-create-update-fcx9z"] Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.550578 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance3235-account-delete-nrtm4"] Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.555393 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-3235-account-create-update-fcx9z"] Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.826113 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2256b30d-1e4e-410f-8b81-6e813009b36d" path="/var/lib/kubelet/pods/2256b30d-1e4e-410f-8b81-6e813009b36d/volumes" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.826784 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31e8985d-2ba9-413c-a0dd-860084f4fac5" path="/var/lib/kubelet/pods/31e8985d-2ba9-413c-a0dd-860084f4fac5/volumes" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.827405 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75fbd2cd-656d-4547-bd68-2b679f418780" path="/var/lib/kubelet/pods/75fbd2cd-656d-4547-bd68-2b679f418780/volumes" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.828498 4711 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="94cf3b0f-ea64-4562-bf3c-95b2c4402c54" path="/var/lib/kubelet/pods/94cf3b0f-ea64-4562-bf3c-95b2c4402c54/volumes" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.829055 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3cf8de0-7fe4-4754-9827-f8b793104551" path="/var/lib/kubelet/pods/b3cf8de0-7fe4-4754-9827-f8b793104551/volumes" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.829526 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c28c016a-8dd3-4a25-9d3e-a25bdcc511bd" path="/var/lib/kubelet/pods/c28c016a-8dd3-4a25-9d3e-a25bdcc511bd/volumes" Dec 03 12:52:43 crc kubenswrapper[4711]: I1203 12:52:43.830462 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a" path="/var/lib/kubelet/pods/ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a/volumes" Dec 03 12:52:44 crc kubenswrapper[4711]: I1203 12:52:44.817714 4711 scope.go:117] "RemoveContainer" containerID="34c4c8649e6b4de50f304749dbfcdcd32a9ef18c76884ef8de700370ecd0a6e3" Dec 03 12:52:44 crc kubenswrapper[4711]: E1203 12:52:44.818007 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:52:47 crc kubenswrapper[4711]: I1203 12:52:47.036820 4711 scope.go:117] "RemoveContainer" containerID="ee9a67439fed18de902c8202f3607e93d9973fe87e7a1974fd509c9c3288ace8" Dec 03 12:52:47 crc kubenswrapper[4711]: I1203 12:52:47.062668 4711 scope.go:117] "RemoveContainer" containerID="8dfd6b14f1478b539768c44ce6e52d71d7a2141500806e78386bbbae36504b00" Dec 03 12:52:47 crc kubenswrapper[4711]: I1203 12:52:47.141200 4711 
scope.go:117] "RemoveContainer" containerID="3d572da822364ebd58f03c0cc3751a0d65b14cbad06f812de1f0098afe72fa4f" Dec 03 12:52:57 crc kubenswrapper[4711]: I1203 12:52:57.817952 4711 scope.go:117] "RemoveContainer" containerID="34c4c8649e6b4de50f304749dbfcdcd32a9ef18c76884ef8de700370ecd0a6e3" Dec 03 12:52:57 crc kubenswrapper[4711]: E1203 12:52:57.818702 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:53:09 crc kubenswrapper[4711]: I1203 12:53:09.451253 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-l2hvt/must-gather-8k6vl"] Dec 03 12:53:09 crc kubenswrapper[4711]: E1203 12:53:09.452163 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94cf3b0f-ea64-4562-bf3c-95b2c4402c54" containerName="glance-log" Dec 03 12:53:09 crc kubenswrapper[4711]: I1203 12:53:09.452182 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="94cf3b0f-ea64-4562-bf3c-95b2c4402c54" containerName="glance-log" Dec 03 12:53:09 crc kubenswrapper[4711]: E1203 12:53:09.452193 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a" containerName="glance-httpd" Dec 03 12:53:09 crc kubenswrapper[4711]: I1203 12:53:09.452203 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a" containerName="glance-httpd" Dec 03 12:53:09 crc kubenswrapper[4711]: E1203 12:53:09.452218 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94cf3b0f-ea64-4562-bf3c-95b2c4402c54" containerName="glance-httpd" Dec 03 12:53:09 crc kubenswrapper[4711]: I1203 12:53:09.452229 4711 
state_mem.go:107] "Deleted CPUSet assignment" podUID="94cf3b0f-ea64-4562-bf3c-95b2c4402c54" containerName="glance-httpd" Dec 03 12:53:09 crc kubenswrapper[4711]: E1203 12:53:09.452245 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2256b30d-1e4e-410f-8b81-6e813009b36d" containerName="glance-log" Dec 03 12:53:09 crc kubenswrapper[4711]: I1203 12:53:09.452255 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="2256b30d-1e4e-410f-8b81-6e813009b36d" containerName="glance-log" Dec 03 12:53:09 crc kubenswrapper[4711]: E1203 12:53:09.452272 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a" containerName="glance-log" Dec 03 12:53:09 crc kubenswrapper[4711]: I1203 12:53:09.452281 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a" containerName="glance-log" Dec 03 12:53:09 crc kubenswrapper[4711]: E1203 12:53:09.452308 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3cf8de0-7fe4-4754-9827-f8b793104551" containerName="mariadb-account-delete" Dec 03 12:53:09 crc kubenswrapper[4711]: I1203 12:53:09.452319 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3cf8de0-7fe4-4754-9827-f8b793104551" containerName="mariadb-account-delete" Dec 03 12:53:09 crc kubenswrapper[4711]: E1203 12:53:09.452342 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75fbd2cd-656d-4547-bd68-2b679f418780" containerName="glance-log" Dec 03 12:53:09 crc kubenswrapper[4711]: I1203 12:53:09.452352 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="75fbd2cd-656d-4547-bd68-2b679f418780" containerName="glance-log" Dec 03 12:53:09 crc kubenswrapper[4711]: E1203 12:53:09.452371 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75fbd2cd-656d-4547-bd68-2b679f418780" containerName="glance-httpd" Dec 03 12:53:09 crc kubenswrapper[4711]: I1203 12:53:09.452381 4711 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="75fbd2cd-656d-4547-bd68-2b679f418780" containerName="glance-httpd" Dec 03 12:53:09 crc kubenswrapper[4711]: E1203 12:53:09.452393 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2256b30d-1e4e-410f-8b81-6e813009b36d" containerName="glance-httpd" Dec 03 12:53:09 crc kubenswrapper[4711]: I1203 12:53:09.452402 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="2256b30d-1e4e-410f-8b81-6e813009b36d" containerName="glance-httpd" Dec 03 12:53:09 crc kubenswrapper[4711]: I1203 12:53:09.452576 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="94cf3b0f-ea64-4562-bf3c-95b2c4402c54" containerName="glance-log" Dec 03 12:53:09 crc kubenswrapper[4711]: I1203 12:53:09.452600 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="94cf3b0f-ea64-4562-bf3c-95b2c4402c54" containerName="glance-httpd" Dec 03 12:53:09 crc kubenswrapper[4711]: I1203 12:53:09.452613 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a" containerName="glance-httpd" Dec 03 12:53:09 crc kubenswrapper[4711]: I1203 12:53:09.452631 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef1a870b-94d8-4c3b-a6ee-2f7ea748f37a" containerName="glance-log" Dec 03 12:53:09 crc kubenswrapper[4711]: I1203 12:53:09.452644 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="2256b30d-1e4e-410f-8b81-6e813009b36d" containerName="glance-httpd" Dec 03 12:53:09 crc kubenswrapper[4711]: I1203 12:53:09.452660 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="75fbd2cd-656d-4547-bd68-2b679f418780" containerName="glance-log" Dec 03 12:53:09 crc kubenswrapper[4711]: I1203 12:53:09.452677 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="2256b30d-1e4e-410f-8b81-6e813009b36d" containerName="glance-log" Dec 03 12:53:09 crc kubenswrapper[4711]: I1203 12:53:09.452692 4711 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b3cf8de0-7fe4-4754-9827-f8b793104551" containerName="mariadb-account-delete" Dec 03 12:53:09 crc kubenswrapper[4711]: I1203 12:53:09.452709 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="75fbd2cd-656d-4547-bd68-2b679f418780" containerName="glance-httpd" Dec 03 12:53:09 crc kubenswrapper[4711]: I1203 12:53:09.454064 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l2hvt/must-gather-8k6vl" Dec 03 12:53:09 crc kubenswrapper[4711]: I1203 12:53:09.463655 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-l2hvt"/"default-dockercfg-rcv5j" Dec 03 12:53:09 crc kubenswrapper[4711]: I1203 12:53:09.463666 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-l2hvt"/"openshift-service-ca.crt" Dec 03 12:53:09 crc kubenswrapper[4711]: I1203 12:53:09.463983 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-l2hvt"/"kube-root-ca.crt" Dec 03 12:53:09 crc kubenswrapper[4711]: I1203 12:53:09.464023 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-l2hvt/must-gather-8k6vl"] Dec 03 12:53:09 crc kubenswrapper[4711]: I1203 12:53:09.556379 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmhzc\" (UniqueName: \"kubernetes.io/projected/cabb321f-81f8-4dad-b86f-5feead1e7c71-kube-api-access-wmhzc\") pod \"must-gather-8k6vl\" (UID: \"cabb321f-81f8-4dad-b86f-5feead1e7c71\") " pod="openshift-must-gather-l2hvt/must-gather-8k6vl" Dec 03 12:53:09 crc kubenswrapper[4711]: I1203 12:53:09.556700 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cabb321f-81f8-4dad-b86f-5feead1e7c71-must-gather-output\") pod \"must-gather-8k6vl\" (UID: \"cabb321f-81f8-4dad-b86f-5feead1e7c71\") " 
pod="openshift-must-gather-l2hvt/must-gather-8k6vl" Dec 03 12:53:09 crc kubenswrapper[4711]: I1203 12:53:09.657515 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmhzc\" (UniqueName: \"kubernetes.io/projected/cabb321f-81f8-4dad-b86f-5feead1e7c71-kube-api-access-wmhzc\") pod \"must-gather-8k6vl\" (UID: \"cabb321f-81f8-4dad-b86f-5feead1e7c71\") " pod="openshift-must-gather-l2hvt/must-gather-8k6vl" Dec 03 12:53:09 crc kubenswrapper[4711]: I1203 12:53:09.657571 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cabb321f-81f8-4dad-b86f-5feead1e7c71-must-gather-output\") pod \"must-gather-8k6vl\" (UID: \"cabb321f-81f8-4dad-b86f-5feead1e7c71\") " pod="openshift-must-gather-l2hvt/must-gather-8k6vl" Dec 03 12:53:09 crc kubenswrapper[4711]: I1203 12:53:09.658044 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cabb321f-81f8-4dad-b86f-5feead1e7c71-must-gather-output\") pod \"must-gather-8k6vl\" (UID: \"cabb321f-81f8-4dad-b86f-5feead1e7c71\") " pod="openshift-must-gather-l2hvt/must-gather-8k6vl" Dec 03 12:53:09 crc kubenswrapper[4711]: I1203 12:53:09.689970 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmhzc\" (UniqueName: \"kubernetes.io/projected/cabb321f-81f8-4dad-b86f-5feead1e7c71-kube-api-access-wmhzc\") pod \"must-gather-8k6vl\" (UID: \"cabb321f-81f8-4dad-b86f-5feead1e7c71\") " pod="openshift-must-gather-l2hvt/must-gather-8k6vl" Dec 03 12:53:09 crc kubenswrapper[4711]: I1203 12:53:09.778834 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l2hvt/must-gather-8k6vl" Dec 03 12:53:09 crc kubenswrapper[4711]: I1203 12:53:09.996896 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-l2hvt/must-gather-8k6vl"] Dec 03 12:53:10 crc kubenswrapper[4711]: I1203 12:53:10.601192 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l2hvt/must-gather-8k6vl" event={"ID":"cabb321f-81f8-4dad-b86f-5feead1e7c71","Type":"ContainerStarted","Data":"2e3cee718440b5d1f80fd083cd0c00b0c1d8c37796d33aab7911c5b3595f4258"} Dec 03 12:53:10 crc kubenswrapper[4711]: I1203 12:53:10.817159 4711 scope.go:117] "RemoveContainer" containerID="34c4c8649e6b4de50f304749dbfcdcd32a9ef18c76884ef8de700370ecd0a6e3" Dec 03 12:53:10 crc kubenswrapper[4711]: E1203 12:53:10.817375 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:53:14 crc kubenswrapper[4711]: I1203 12:53:14.639818 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l2hvt/must-gather-8k6vl" event={"ID":"cabb321f-81f8-4dad-b86f-5feead1e7c71","Type":"ContainerStarted","Data":"f69f752d495101034e6450769163753313cb74a392b55494932f5337916db572"} Dec 03 12:53:14 crc kubenswrapper[4711]: I1203 12:53:14.640443 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l2hvt/must-gather-8k6vl" event={"ID":"cabb321f-81f8-4dad-b86f-5feead1e7c71","Type":"ContainerStarted","Data":"6e6bfe961a23d19b269d3260507cf761d111a9e509a0e134381c3405f6970682"} Dec 03 12:53:14 crc kubenswrapper[4711]: I1203 12:53:14.656617 4711 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-l2hvt/must-gather-8k6vl" podStartSLOduration=1.7078513320000002 podStartE2EDuration="5.656595557s" podCreationTimestamp="2025-12-03 12:53:09 +0000 UTC" firstStartedPulling="2025-12-03 12:53:10.002832494 +0000 UTC m=+2308.672083749" lastFinishedPulling="2025-12-03 12:53:13.951576689 +0000 UTC m=+2312.620827974" observedRunningTime="2025-12-03 12:53:14.655305762 +0000 UTC m=+2313.324557027" watchObservedRunningTime="2025-12-03 12:53:14.656595557 +0000 UTC m=+2313.325846812" Dec 03 12:53:22 crc kubenswrapper[4711]: I1203 12:53:22.817657 4711 scope.go:117] "RemoveContainer" containerID="34c4c8649e6b4de50f304749dbfcdcd32a9ef18c76884ef8de700370ecd0a6e3" Dec 03 12:53:22 crc kubenswrapper[4711]: E1203 12:53:22.818456 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:53:34 crc kubenswrapper[4711]: I1203 12:53:34.816605 4711 scope.go:117] "RemoveContainer" containerID="34c4c8649e6b4de50f304749dbfcdcd32a9ef18c76884ef8de700370ecd0a6e3" Dec 03 12:53:34 crc kubenswrapper[4711]: E1203 12:53:34.817268 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:53:37 crc kubenswrapper[4711]: I1203 12:53:37.049364 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["glance-kuttl-tests/keystone-2462-account-create-update-spvfj"] Dec 03 12:53:37 crc kubenswrapper[4711]: I1203 12:53:37.065985 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-db-create-xmfcd"] Dec 03 12:53:37 crc kubenswrapper[4711]: I1203 12:53:37.078216 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-db-create-xmfcd"] Dec 03 12:53:37 crc kubenswrapper[4711]: I1203 12:53:37.088770 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-2462-account-create-update-spvfj"] Dec 03 12:53:37 crc kubenswrapper[4711]: I1203 12:53:37.826763 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2112d2ef-102c-4a01-8124-6b17a34f50a5" path="/var/lib/kubelet/pods/2112d2ef-102c-4a01-8124-6b17a34f50a5/volumes" Dec 03 12:53:37 crc kubenswrapper[4711]: I1203 12:53:37.827392 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3eb08f39-558e-497e-a3dc-f01b252384e4" path="/var/lib/kubelet/pods/3eb08f39-558e-497e-a3dc-f01b252384e4/volumes" Dec 03 12:53:45 crc kubenswrapper[4711]: I1203 12:53:45.817459 4711 scope.go:117] "RemoveContainer" containerID="34c4c8649e6b4de50f304749dbfcdcd32a9ef18c76884ef8de700370ecd0a6e3" Dec 03 12:53:45 crc kubenswrapper[4711]: E1203 12:53:45.818245 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:53:47 crc kubenswrapper[4711]: I1203 12:53:47.268076 4711 scope.go:117] "RemoveContainer" containerID="ba23666c2a5b4741700eedaa5fdc01b243680e994a0f699f8432cccdca8ea50f" Dec 03 12:53:47 crc kubenswrapper[4711]: I1203 
12:53:47.304211 4711 scope.go:117] "RemoveContainer" containerID="fd11cddbff4440c60d9f8f3dac30f6f6f5d1df4ab243dbf34604304c43621b60" Dec 03 12:53:47 crc kubenswrapper[4711]: I1203 12:53:47.331571 4711 scope.go:117] "RemoveContainer" containerID="0aa6cfcc7cb1bcc1eddab83e201e348e86f7e7300c7dda744d07e6a23a153a2e" Dec 03 12:53:48 crc kubenswrapper[4711]: I1203 12:53:48.274447 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd_6cf0822e-a898-4a5a-9cdd-a10e0a56f468/util/0.log" Dec 03 12:53:48 crc kubenswrapper[4711]: I1203 12:53:48.448400 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd_6cf0822e-a898-4a5a-9cdd-a10e0a56f468/pull/0.log" Dec 03 12:53:48 crc kubenswrapper[4711]: I1203 12:53:48.473965 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd_6cf0822e-a898-4a5a-9cdd-a10e0a56f468/util/0.log" Dec 03 12:53:48 crc kubenswrapper[4711]: I1203 12:53:48.490731 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd_6cf0822e-a898-4a5a-9cdd-a10e0a56f468/pull/0.log" Dec 03 12:53:48 crc kubenswrapper[4711]: I1203 12:53:48.629224 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd_6cf0822e-a898-4a5a-9cdd-a10e0a56f468/util/0.log" Dec 03 12:53:48 crc kubenswrapper[4711]: I1203 12:53:48.629856 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd_6cf0822e-a898-4a5a-9cdd-a10e0a56f468/pull/0.log" Dec 03 12:53:48 crc kubenswrapper[4711]: I1203 12:53:48.702937 4711 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121dvtkd_6cf0822e-a898-4a5a-9cdd-a10e0a56f468/extract/0.log" Dec 03 12:53:48 crc kubenswrapper[4711]: I1203 12:53:48.801534 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d_d27600e8-c4f5-416a-8138-51c87928cbdf/util/0.log" Dec 03 12:53:48 crc kubenswrapper[4711]: I1203 12:53:48.929292 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d_d27600e8-c4f5-416a-8138-51c87928cbdf/util/0.log" Dec 03 12:53:48 crc kubenswrapper[4711]: I1203 12:53:48.950399 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d_d27600e8-c4f5-416a-8138-51c87928cbdf/pull/0.log" Dec 03 12:53:48 crc kubenswrapper[4711]: I1203 12:53:48.984281 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d_d27600e8-c4f5-416a-8138-51c87928cbdf/pull/0.log" Dec 03 12:53:49 crc kubenswrapper[4711]: I1203 12:53:49.133686 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d_d27600e8-c4f5-416a-8138-51c87928cbdf/pull/0.log" Dec 03 12:53:49 crc kubenswrapper[4711]: I1203 12:53:49.154856 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d_d27600e8-c4f5-416a-8138-51c87928cbdf/extract/0.log" Dec 03 12:53:49 crc kubenswrapper[4711]: I1203 12:53:49.157395 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_55a3886cc1ed42812df4eab61c7a6033dc924d195539e8545c8f175f616zc9d_d27600e8-c4f5-416a-8138-51c87928cbdf/util/0.log" Dec 03 12:53:49 crc 
kubenswrapper[4711]: I1203 12:53:49.303839 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz_c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3/util/0.log" Dec 03 12:53:49 crc kubenswrapper[4711]: I1203 12:53:49.469453 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz_c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3/util/0.log" Dec 03 12:53:49 crc kubenswrapper[4711]: I1203 12:53:49.504120 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz_c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3/pull/0.log" Dec 03 12:53:49 crc kubenswrapper[4711]: I1203 12:53:49.524096 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz_c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3/pull/0.log" Dec 03 12:53:49 crc kubenswrapper[4711]: I1203 12:53:49.651755 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz_c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3/pull/0.log" Dec 03 12:53:49 crc kubenswrapper[4711]: I1203 12:53:49.706635 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz_c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3/util/0.log" Dec 03 12:53:49 crc kubenswrapper[4711]: I1203 12:53:49.719630 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dnjspz_c9b765e7-1caf-4b36-98c9-8a5d5c0b88b3/extract/0.log" Dec 03 12:53:49 crc kubenswrapper[4711]: I1203 12:53:49.830926 4711 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz_2b104ca1-f49a-4127-8a94-e74b0834307e/util/0.log" Dec 03 12:53:49 crc kubenswrapper[4711]: I1203 12:53:49.987481 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz_2b104ca1-f49a-4127-8a94-e74b0834307e/util/0.log" Dec 03 12:53:49 crc kubenswrapper[4711]: I1203 12:53:49.999169 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz_2b104ca1-f49a-4127-8a94-e74b0834307e/pull/0.log" Dec 03 12:53:50 crc kubenswrapper[4711]: I1203 12:53:50.001563 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz_2b104ca1-f49a-4127-8a94-e74b0834307e/pull/0.log" Dec 03 12:53:50 crc kubenswrapper[4711]: I1203 12:53:50.208108 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz_2b104ca1-f49a-4127-8a94-e74b0834307e/util/0.log" Dec 03 12:53:50 crc kubenswrapper[4711]: I1203 12:53:50.223244 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz_2b104ca1-f49a-4127-8a94-e74b0834307e/pull/0.log" Dec 03 12:53:50 crc kubenswrapper[4711]: I1203 12:53:50.256973 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_75cd1b38b3b1aa229d898b2e7287b3342b7a32892ab051f36f12919f65l2sqz_2b104ca1-f49a-4127-8a94-e74b0834307e/extract/0.log" Dec 03 12:53:50 crc kubenswrapper[4711]: I1203 12:53:50.380416 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2_84c113ac-ae1d-427b-892c-64fca086fb54/util/0.log" Dec 03 12:53:50 crc 
kubenswrapper[4711]: I1203 12:53:50.522982 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2_84c113ac-ae1d-427b-892c-64fca086fb54/util/0.log" Dec 03 12:53:50 crc kubenswrapper[4711]: I1203 12:53:50.559867 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2_84c113ac-ae1d-427b-892c-64fca086fb54/pull/0.log" Dec 03 12:53:50 crc kubenswrapper[4711]: I1203 12:53:50.571765 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2_84c113ac-ae1d-427b-892c-64fca086fb54/pull/0.log" Dec 03 12:53:50 crc kubenswrapper[4711]: I1203 12:53:50.720423 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2_84c113ac-ae1d-427b-892c-64fca086fb54/util/0.log" Dec 03 12:53:50 crc kubenswrapper[4711]: I1203 12:53:50.727538 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2_84c113ac-ae1d-427b-892c-64fca086fb54/pull/0.log" Dec 03 12:53:50 crc kubenswrapper[4711]: I1203 12:53:50.749163 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590n4jj2_84c113ac-ae1d-427b-892c-64fca086fb54/extract/0.log" Dec 03 12:53:50 crc kubenswrapper[4711]: I1203 12:53:50.913933 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82_3bcd3a17-8baf-4419-89f9-ae37d4f14176/util/0.log" Dec 03 12:53:51 crc kubenswrapper[4711]: I1203 12:53:51.080051 4711 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82_3bcd3a17-8baf-4419-89f9-ae37d4f14176/util/0.log" Dec 03 12:53:51 crc kubenswrapper[4711]: I1203 12:53:51.080816 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82_3bcd3a17-8baf-4419-89f9-ae37d4f14176/pull/0.log" Dec 03 12:53:51 crc kubenswrapper[4711]: I1203 12:53:51.117614 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82_3bcd3a17-8baf-4419-89f9-ae37d4f14176/pull/0.log" Dec 03 12:53:51 crc kubenswrapper[4711]: I1203 12:53:51.239774 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82_3bcd3a17-8baf-4419-89f9-ae37d4f14176/util/0.log" Dec 03 12:53:51 crc kubenswrapper[4711]: I1203 12:53:51.274600 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82_3bcd3a17-8baf-4419-89f9-ae37d4f14176/extract/0.log" Dec 03 12:53:51 crc kubenswrapper[4711]: I1203 12:53:51.274694 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368chmr82_3bcd3a17-8baf-4419-89f9-ae37d4f14176/pull/0.log" Dec 03 12:53:51 crc kubenswrapper[4711]: I1203 12:53:51.334439 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq_1b5c69ae-9e6e-431f-adf9-6f6b40e6254b/util/0.log" Dec 03 12:53:51 crc kubenswrapper[4711]: I1203 12:53:51.601936 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq_1b5c69ae-9e6e-431f-adf9-6f6b40e6254b/pull/0.log" Dec 03 12:53:51 crc 
kubenswrapper[4711]: I1203 12:53:51.634190 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq_1b5c69ae-9e6e-431f-adf9-6f6b40e6254b/pull/0.log" Dec 03 12:53:51 crc kubenswrapper[4711]: I1203 12:53:51.637070 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq_1b5c69ae-9e6e-431f-adf9-6f6b40e6254b/util/0.log" Dec 03 12:53:51 crc kubenswrapper[4711]: I1203 12:53:51.798771 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq_1b5c69ae-9e6e-431f-adf9-6f6b40e6254b/util/0.log" Dec 03 12:53:51 crc kubenswrapper[4711]: I1203 12:53:51.817393 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq_1b5c69ae-9e6e-431f-adf9-6f6b40e6254b/extract/0.log" Dec 03 12:53:51 crc kubenswrapper[4711]: I1203 12:53:51.860806 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eb929510237420951f9aa466df77004deff2bfef9fd1abb9ea842fc3ed8t8mq_1b5c69ae-9e6e-431f-adf9-6f6b40e6254b/pull/0.log" Dec 03 12:53:52 crc kubenswrapper[4711]: I1203 12:53:52.042920 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-f5bd89d87-dh9xv_885ab9bd-dfa6-4245-a8ac-126a9911538a/manager/0.log" Dec 03 12:53:52 crc kubenswrapper[4711]: I1203 12:53:52.055035 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-k5dbg"] Dec 03 12:53:52 crc kubenswrapper[4711]: I1203 12:53:52.062882 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-k5dbg"] Dec 03 12:53:52 crc kubenswrapper[4711]: I1203 12:53:52.084379 4711 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_glance-operator-index-82qq5_b55ffe9c-6afa-4506-b385-46bc81594479/registry-server/0.log" Dec 03 12:53:52 crc kubenswrapper[4711]: I1203 12:53:52.115717 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7b5746d556-rrj48_7a329276-2ff5-41a6-90a7-5237ce641101/manager/0.log" Dec 03 12:53:52 crc kubenswrapper[4711]: I1203 12:53:52.284725 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-56b7d7797c-sljsp_07432873-5f89-4207-bc9f-d93994d12733/kube-rbac-proxy/0.log" Dec 03 12:53:52 crc kubenswrapper[4711]: I1203 12:53:52.308901 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-index-pvgpj_174cadcb-4175-478d-a8d1-0614f02a4cc2/registry-server/0.log" Dec 03 12:53:52 crc kubenswrapper[4711]: I1203 12:53:52.353592 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-56b7d7797c-sljsp_07432873-5f89-4207-bc9f-d93994d12733/manager/0.log" Dec 03 12:53:52 crc kubenswrapper[4711]: I1203 12:53:52.498751 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-index-rmc57_b7bb5832-bc73-4090-ac27-fb8809668c72/registry-server/0.log" Dec 03 12:53:52 crc kubenswrapper[4711]: I1203 12:53:52.562625 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-55bc77bd75-m9gpn_8a3b9c74-bb7e-48cb-926d-5604a3bdb65c/manager/0.log" Dec 03 12:53:52 crc kubenswrapper[4711]: I1203 12:53:52.696329 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-index-26962_6c07e470-357e-4332-a4f2-e713b2a9e485/registry-server/0.log" Dec 03 12:53:52 crc kubenswrapper[4711]: I1203 12:53:52.717474 4711 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-69f7cc4dcf-wgllq_49682020-cbd0-4b76-941d-2e0ae637db0b/manager/0.log" Dec 03 12:53:52 crc kubenswrapper[4711]: I1203 12:53:52.838930 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-index-5qvn8_a5448fb5-04ac-4514-b7f3-39e3c17a10cb/registry-server/0.log" Dec 03 12:53:52 crc kubenswrapper[4711]: I1203 12:53:52.933542 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-779fc9694b-m5pkk_aa0a2065-4d7f-45a4-a13b-f7eb60da44fd/operator/0.log" Dec 03 12:53:52 crc kubenswrapper[4711]: I1203 12:53:52.989271 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-index-x9l2v_c3f55cbd-074a-4b43-a557-d6c5ce1ada0d/registry-server/0.log" Dec 03 12:53:53 crc kubenswrapper[4711]: I1203 12:53:53.090596 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-7fbd454467-l5gnh_6a18ab91-ac21-48e9-804b-afe295a73e9c/manager/0.log" Dec 03 12:53:53 crc kubenswrapper[4711]: I1203 12:53:53.136844 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-index-rq4cg_d23e5068-0a2c-40ac-bc4a-fb8a69961f3f/registry-server/0.log" Dec 03 12:53:53 crc kubenswrapper[4711]: I1203 12:53:53.826084 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f3afe72-bd28-46b4-9956-be4f9727c405" path="/var/lib/kubelet/pods/6f3afe72-bd28-46b4-9956-be4f9727c405/volumes" Dec 03 12:53:56 crc kubenswrapper[4711]: I1203 12:53:56.816896 4711 scope.go:117] "RemoveContainer" containerID="34c4c8649e6b4de50f304749dbfcdcd32a9ef18c76884ef8de700370ecd0a6e3" Dec 03 12:53:56 crc kubenswrapper[4711]: E1203 12:53:56.817346 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:54:00 crc kubenswrapper[4711]: I1203 12:54:00.023474 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-8246r"] Dec 03 12:54:00 crc kubenswrapper[4711]: I1203 12:54:00.029779 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-8246r"] Dec 03 12:54:01 crc kubenswrapper[4711]: I1203 12:54:01.833563 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7624667-e238-4d33-940f-34d33d029ad3" path="/var/lib/kubelet/pods/a7624667-e238-4d33-940f-34d33d029ad3/volumes" Dec 03 12:54:07 crc kubenswrapper[4711]: I1203 12:54:07.941024 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-zhfzk_f3095a2e-f5c5-436e-953a-1fb6ae1950bb/control-plane-machine-set-operator/0.log" Dec 03 12:54:08 crc kubenswrapper[4711]: I1203 12:54:08.096191 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-hnrkf_064e5e0c-68f0-4d05-b054-17948a298623/kube-rbac-proxy/0.log" Dec 03 12:54:08 crc kubenswrapper[4711]: I1203 12:54:08.131491 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-hnrkf_064e5e0c-68f0-4d05-b054-17948a298623/machine-api-operator/0.log" Dec 03 12:54:11 crc kubenswrapper[4711]: I1203 12:54:11.823597 4711 scope.go:117] "RemoveContainer" containerID="34c4c8649e6b4de50f304749dbfcdcd32a9ef18c76884ef8de700370ecd0a6e3" Dec 03 12:54:11 crc kubenswrapper[4711]: E1203 12:54:11.824371 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:54:23 crc kubenswrapper[4711]: I1203 12:54:23.817316 4711 scope.go:117] "RemoveContainer" containerID="34c4c8649e6b4de50f304749dbfcdcd32a9ef18c76884ef8de700370ecd0a6e3" Dec 03 12:54:23 crc kubenswrapper[4711]: E1203 12:54:23.818143 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:54:24 crc kubenswrapper[4711]: I1203 12:54:24.396396 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-sflcc_663169ff-9a0b-4c7c-8326-db8528f88c00/kube-rbac-proxy/0.log" Dec 03 12:54:24 crc kubenswrapper[4711]: I1203 12:54:24.479970 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-sflcc_663169ff-9a0b-4c7c-8326-db8528f88c00/controller/0.log" Dec 03 12:54:24 crc kubenswrapper[4711]: I1203 12:54:24.579899 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m42lk_f567c71f-c1ec-47c6-9173-8c1a29524cf8/cp-frr-files/0.log" Dec 03 12:54:24 crc kubenswrapper[4711]: I1203 12:54:24.779280 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m42lk_f567c71f-c1ec-47c6-9173-8c1a29524cf8/cp-reloader/0.log" Dec 03 12:54:24 crc kubenswrapper[4711]: I1203 12:54:24.787752 4711 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-m42lk_f567c71f-c1ec-47c6-9173-8c1a29524cf8/cp-metrics/0.log" Dec 03 12:54:24 crc kubenswrapper[4711]: I1203 12:54:24.788020 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m42lk_f567c71f-c1ec-47c6-9173-8c1a29524cf8/cp-reloader/0.log" Dec 03 12:54:24 crc kubenswrapper[4711]: I1203 12:54:24.788101 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m42lk_f567c71f-c1ec-47c6-9173-8c1a29524cf8/cp-frr-files/0.log" Dec 03 12:54:25 crc kubenswrapper[4711]: I1203 12:54:25.005236 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m42lk_f567c71f-c1ec-47c6-9173-8c1a29524cf8/cp-metrics/0.log" Dec 03 12:54:25 crc kubenswrapper[4711]: I1203 12:54:25.009933 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m42lk_f567c71f-c1ec-47c6-9173-8c1a29524cf8/cp-reloader/0.log" Dec 03 12:54:25 crc kubenswrapper[4711]: I1203 12:54:25.009962 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m42lk_f567c71f-c1ec-47c6-9173-8c1a29524cf8/cp-frr-files/0.log" Dec 03 12:54:25 crc kubenswrapper[4711]: I1203 12:54:25.106248 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m42lk_f567c71f-c1ec-47c6-9173-8c1a29524cf8/cp-metrics/0.log" Dec 03 12:54:25 crc kubenswrapper[4711]: I1203 12:54:25.239155 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m42lk_f567c71f-c1ec-47c6-9173-8c1a29524cf8/cp-reloader/0.log" Dec 03 12:54:25 crc kubenswrapper[4711]: I1203 12:54:25.239960 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m42lk_f567c71f-c1ec-47c6-9173-8c1a29524cf8/cp-frr-files/0.log" Dec 03 12:54:25 crc kubenswrapper[4711]: I1203 12:54:25.245094 4711 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-m42lk_f567c71f-c1ec-47c6-9173-8c1a29524cf8/cp-metrics/0.log" Dec 03 12:54:25 crc kubenswrapper[4711]: I1203 12:54:25.305983 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m42lk_f567c71f-c1ec-47c6-9173-8c1a29524cf8/controller/0.log" Dec 03 12:54:25 crc kubenswrapper[4711]: I1203 12:54:25.412278 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m42lk_f567c71f-c1ec-47c6-9173-8c1a29524cf8/kube-rbac-proxy/0.log" Dec 03 12:54:25 crc kubenswrapper[4711]: I1203 12:54:25.444519 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m42lk_f567c71f-c1ec-47c6-9173-8c1a29524cf8/frr-metrics/0.log" Dec 03 12:54:25 crc kubenswrapper[4711]: I1203 12:54:25.468197 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m42lk_f567c71f-c1ec-47c6-9173-8c1a29524cf8/kube-rbac-proxy-frr/0.log" Dec 03 12:54:25 crc kubenswrapper[4711]: I1203 12:54:25.632638 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m42lk_f567c71f-c1ec-47c6-9173-8c1a29524cf8/reloader/0.log" Dec 03 12:54:25 crc kubenswrapper[4711]: I1203 12:54:25.695559 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-fr4tv_947998b3-f1a3-486a-91b4-108fcc09af6d/frr-k8s-webhook-server/0.log" Dec 03 12:54:25 crc kubenswrapper[4711]: I1203 12:54:25.894799 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-78f9c874c6-vlmhs_25892076-3b01-4a62-884a-3d658b400d60/manager/0.log" Dec 03 12:54:25 crc kubenswrapper[4711]: I1203 12:54:25.990437 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m42lk_f567c71f-c1ec-47c6-9173-8c1a29524cf8/frr/0.log" Dec 03 12:54:26 crc kubenswrapper[4711]: I1203 12:54:26.080754 4711 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5bc7f8779c-bxqk4_f63d4a78-0d83-4e48-a531-86e871adfa2d/webhook-server/0.log" Dec 03 12:54:26 crc kubenswrapper[4711]: I1203 12:54:26.087621 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-9xmmz_f4c50fcb-c298-476e-b6c6-4af490b8a6ed/kube-rbac-proxy/0.log" Dec 03 12:54:26 crc kubenswrapper[4711]: I1203 12:54:26.315161 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-9xmmz_f4c50fcb-c298-476e-b6c6-4af490b8a6ed/speaker/0.log" Dec 03 12:54:38 crc kubenswrapper[4711]: I1203 12:54:38.817197 4711 scope.go:117] "RemoveContainer" containerID="34c4c8649e6b4de50f304749dbfcdcd32a9ef18c76884ef8de700370ecd0a6e3" Dec 03 12:54:38 crc kubenswrapper[4711]: E1203 12:54:38.817864 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:54:40 crc kubenswrapper[4711]: I1203 12:54:40.225369 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_keystone-6884479545-5vgqj_a5fda40f-b786-4b92-b502-f8457a37d2aa/keystone-api/0.log" Dec 03 12:54:40 crc kubenswrapper[4711]: I1203 12:54:40.250941 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-0_9afca0df-f7de-4a6f-88bf-49378019f63d/mysql-bootstrap/0.log" Dec 03 12:54:40 crc kubenswrapper[4711]: I1203 12:54:40.407705 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-0_9afca0df-f7de-4a6f-88bf-49378019f63d/mysql-bootstrap/0.log" Dec 03 12:54:40 crc kubenswrapper[4711]: I1203 12:54:40.436693 4711 log.go:25] 
"Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-0_9afca0df-f7de-4a6f-88bf-49378019f63d/galera/0.log" Dec 03 12:54:40 crc kubenswrapper[4711]: I1203 12:54:40.628198 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-1_66535e59-1359-4bb1-bc04-7cf81d7fedc6/mysql-bootstrap/0.log" Dec 03 12:54:40 crc kubenswrapper[4711]: I1203 12:54:40.885686 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-1_66535e59-1359-4bb1-bc04-7cf81d7fedc6/galera/0.log" Dec 03 12:54:40 crc kubenswrapper[4711]: I1203 12:54:40.922847 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-1_66535e59-1359-4bb1-bc04-7cf81d7fedc6/mysql-bootstrap/0.log" Dec 03 12:54:41 crc kubenswrapper[4711]: I1203 12:54:41.127922 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-2_31b86704-7f22-45ea-994f-26a4f953e56b/mysql-bootstrap/0.log" Dec 03 12:54:41 crc kubenswrapper[4711]: I1203 12:54:41.391871 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-2_31b86704-7f22-45ea-994f-26a4f953e56b/mysql-bootstrap/0.log" Dec 03 12:54:41 crc kubenswrapper[4711]: I1203 12:54:41.496347 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-2_31b86704-7f22-45ea-994f-26a4f953e56b/galera/0.log" Dec 03 12:54:41 crc kubenswrapper[4711]: I1203 12:54:41.575466 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstackclient_0af24c58-d931-49f4-b886-67dd862d0170/openstackclient/0.log" Dec 03 12:54:41 crc kubenswrapper[4711]: I1203 12:54:41.752782 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_rabbitmq-server-0_3b4ec49f-4f73-4f76-9945-01577848b79c/setup-container/0.log" Dec 03 12:54:41 crc kubenswrapper[4711]: I1203 12:54:41.980058 4711 log.go:25] 
"Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_rabbitmq-server-0_3b4ec49f-4f73-4f76-9945-01577848b79c/setup-container/0.log" Dec 03 12:54:42 crc kubenswrapper[4711]: I1203 12:54:42.007632 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_rabbitmq-server-0_3b4ec49f-4f73-4f76-9945-01577848b79c/rabbitmq/0.log" Dec 03 12:54:42 crc kubenswrapper[4711]: I1203 12:54:42.059765 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_memcached-0_8e6f76a1-658a-47a6-a675-0130bb6cc7a8/memcached/0.log" Dec 03 12:54:42 crc kubenswrapper[4711]: I1203 12:54:42.188091 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-proxy-8cfd9857-v9jnw_8a481af3-5cbd-47a8-a9b5-d8d73feab045/proxy-httpd/0.log" Dec 03 12:54:42 crc kubenswrapper[4711]: I1203 12:54:42.217257 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-proxy-8cfd9857-v9jnw_8a481af3-5cbd-47a8-a9b5-d8d73feab045/proxy-server/0.log" Dec 03 12:54:42 crc kubenswrapper[4711]: I1203 12:54:42.240518 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-ring-rebalance-tcsg8_4f07a4c6-d466-42b4-aa65-5f58fa554bbd/swift-ring-rebalance/0.log" Dec 03 12:54:42 crc kubenswrapper[4711]: I1203 12:54:42.393331 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_d8773367-0744-42d7-a9ee-8f508d9d9c97/account-auditor/0.log" Dec 03 12:54:42 crc kubenswrapper[4711]: I1203 12:54:42.410968 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_d8773367-0744-42d7-a9ee-8f508d9d9c97/account-reaper/0.log" Dec 03 12:54:42 crc kubenswrapper[4711]: I1203 12:54:42.464915 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_d8773367-0744-42d7-a9ee-8f508d9d9c97/account-replicator/0.log" Dec 03 12:54:42 crc kubenswrapper[4711]: I1203 
12:54:42.558594 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_d8773367-0744-42d7-a9ee-8f508d9d9c97/account-server/0.log" Dec 03 12:54:42 crc kubenswrapper[4711]: I1203 12:54:42.599069 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_d8773367-0744-42d7-a9ee-8f508d9d9c97/container-replicator/0.log" Dec 03 12:54:42 crc kubenswrapper[4711]: I1203 12:54:42.633044 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_d8773367-0744-42d7-a9ee-8f508d9d9c97/container-auditor/0.log" Dec 03 12:54:42 crc kubenswrapper[4711]: I1203 12:54:42.666838 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_d8773367-0744-42d7-a9ee-8f508d9d9c97/container-server/0.log" Dec 03 12:54:42 crc kubenswrapper[4711]: I1203 12:54:42.752534 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_d8773367-0744-42d7-a9ee-8f508d9d9c97/container-updater/0.log" Dec 03 12:54:42 crc kubenswrapper[4711]: I1203 12:54:42.767352 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_d8773367-0744-42d7-a9ee-8f508d9d9c97/object-auditor/0.log" Dec 03 12:54:42 crc kubenswrapper[4711]: I1203 12:54:42.782694 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_d8773367-0744-42d7-a9ee-8f508d9d9c97/object-expirer/0.log" Dec 03 12:54:42 crc kubenswrapper[4711]: I1203 12:54:42.861170 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_d8773367-0744-42d7-a9ee-8f508d9d9c97/object-replicator/0.log" Dec 03 12:54:42 crc kubenswrapper[4711]: I1203 12:54:42.940672 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_d8773367-0744-42d7-a9ee-8f508d9d9c97/object-updater/0.log" Dec 03 12:54:42 crc kubenswrapper[4711]: 
I1203 12:54:42.945674 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_d8773367-0744-42d7-a9ee-8f508d9d9c97/object-server/0.log" Dec 03 12:54:42 crc kubenswrapper[4711]: I1203 12:54:42.978705 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_d8773367-0744-42d7-a9ee-8f508d9d9c97/rsync/0.log" Dec 03 12:54:43 crc kubenswrapper[4711]: I1203 12:54:43.027998 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_d8773367-0744-42d7-a9ee-8f508d9d9c97/swift-recon-cron/0.log" Dec 03 12:54:47 crc kubenswrapper[4711]: I1203 12:54:47.466726 4711 scope.go:117] "RemoveContainer" containerID="dd72ac794d49686367b52e03b054cb0cd4ccad2c13c72a2725e71b582a673e8a" Dec 03 12:54:47 crc kubenswrapper[4711]: I1203 12:54:47.489225 4711 scope.go:117] "RemoveContainer" containerID="0bd17b4a32a5e2f6b06fe36d4071ff0e6b533d3d83ae46f77836695173b59f8d" Dec 03 12:54:47 crc kubenswrapper[4711]: I1203 12:54:47.531592 4711 scope.go:117] "RemoveContainer" containerID="261c06a2564639d9ed8c4652ec49d459a501d4bcdd68ee33601177a986a52a39" Dec 03 12:54:47 crc kubenswrapper[4711]: I1203 12:54:47.549889 4711 scope.go:117] "RemoveContainer" containerID="e89a7a0e7372b219b5ad194473c98fa7b6d96b5a2f0da2445e19c04fabe75d17" Dec 03 12:54:47 crc kubenswrapper[4711]: I1203 12:54:47.587840 4711 scope.go:117] "RemoveContainer" containerID="9fa0d8c99d2c47b1f770f04652f96fd8b76b35a6947e82f61bb8923cccfd6a3e" Dec 03 12:54:47 crc kubenswrapper[4711]: I1203 12:54:47.604798 4711 scope.go:117] "RemoveContainer" containerID="bcf0a0545b10dccbd48345ce8b39c053d562c3978af8205f0307872a4ae3385a" Dec 03 12:54:53 crc kubenswrapper[4711]: I1203 12:54:53.817955 4711 scope.go:117] "RemoveContainer" containerID="34c4c8649e6b4de50f304749dbfcdcd32a9ef18c76884ef8de700370ecd0a6e3" Dec 03 12:54:53 crc kubenswrapper[4711]: E1203 12:54:53.819953 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:54:55 crc kubenswrapper[4711]: I1203 12:54:55.121009 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms_19e4b241-651d-4300-9adb-3ef74168d5f7/util/0.log" Dec 03 12:54:55 crc kubenswrapper[4711]: I1203 12:54:55.437334 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms_19e4b241-651d-4300-9adb-3ef74168d5f7/util/0.log" Dec 03 12:54:55 crc kubenswrapper[4711]: I1203 12:54:55.442590 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms_19e4b241-651d-4300-9adb-3ef74168d5f7/pull/0.log" Dec 03 12:54:55 crc kubenswrapper[4711]: I1203 12:54:55.470332 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms_19e4b241-651d-4300-9adb-3ef74168d5f7/pull/0.log" Dec 03 12:54:55 crc kubenswrapper[4711]: I1203 12:54:55.579413 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms_19e4b241-651d-4300-9adb-3ef74168d5f7/util/0.log" Dec 03 12:54:55 crc kubenswrapper[4711]: I1203 12:54:55.634670 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms_19e4b241-651d-4300-9adb-3ef74168d5f7/pull/0.log" Dec 03 12:54:55 crc kubenswrapper[4711]: I1203 12:54:55.658887 
4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cvdms_19e4b241-651d-4300-9adb-3ef74168d5f7/extract/0.log" Dec 03 12:54:55 crc kubenswrapper[4711]: I1203 12:54:55.768125 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gsnk6_888df056-de6f-4c3c-80b3-3029afdcfe85/extract-utilities/0.log" Dec 03 12:54:55 crc kubenswrapper[4711]: I1203 12:54:55.948217 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gsnk6_888df056-de6f-4c3c-80b3-3029afdcfe85/extract-content/0.log" Dec 03 12:54:55 crc kubenswrapper[4711]: I1203 12:54:55.958466 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gsnk6_888df056-de6f-4c3c-80b3-3029afdcfe85/extract-utilities/0.log" Dec 03 12:54:55 crc kubenswrapper[4711]: I1203 12:54:55.958507 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gsnk6_888df056-de6f-4c3c-80b3-3029afdcfe85/extract-content/0.log" Dec 03 12:54:56 crc kubenswrapper[4711]: I1203 12:54:56.102959 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gsnk6_888df056-de6f-4c3c-80b3-3029afdcfe85/extract-utilities/0.log" Dec 03 12:54:56 crc kubenswrapper[4711]: I1203 12:54:56.166447 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gsnk6_888df056-de6f-4c3c-80b3-3029afdcfe85/extract-content/0.log" Dec 03 12:54:56 crc kubenswrapper[4711]: I1203 12:54:56.328214 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ccx4z_ef97ed00-d8de-42fd-94ab-8ff1b1569291/extract-utilities/0.log" Dec 03 12:54:56 crc kubenswrapper[4711]: I1203 12:54:56.447682 4711 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-gsnk6_888df056-de6f-4c3c-80b3-3029afdcfe85/registry-server/0.log" Dec 03 12:54:56 crc kubenswrapper[4711]: I1203 12:54:56.584595 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ccx4z_ef97ed00-d8de-42fd-94ab-8ff1b1569291/extract-utilities/0.log" Dec 03 12:54:56 crc kubenswrapper[4711]: I1203 12:54:56.608067 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ccx4z_ef97ed00-d8de-42fd-94ab-8ff1b1569291/extract-content/0.log" Dec 03 12:54:56 crc kubenswrapper[4711]: I1203 12:54:56.616332 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ccx4z_ef97ed00-d8de-42fd-94ab-8ff1b1569291/extract-content/0.log" Dec 03 12:54:56 crc kubenswrapper[4711]: I1203 12:54:56.835235 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ccx4z_ef97ed00-d8de-42fd-94ab-8ff1b1569291/extract-content/0.log" Dec 03 12:54:56 crc kubenswrapper[4711]: I1203 12:54:56.839066 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ccx4z_ef97ed00-d8de-42fd-94ab-8ff1b1569291/extract-utilities/0.log" Dec 03 12:54:57 crc kubenswrapper[4711]: I1203 12:54:57.069940 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wrqcm_de20138d-da5d-4975-b68e-cbd74b7be8cb/extract-utilities/0.log" Dec 03 12:54:57 crc kubenswrapper[4711]: I1203 12:54:57.124417 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-sr2lt_3a0a6697-5cb1-4381-a050-ce15daaeb231/marketplace-operator/0.log" Dec 03 12:54:57 crc kubenswrapper[4711]: I1203 12:54:57.259015 4711 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-ccx4z_ef97ed00-d8de-42fd-94ab-8ff1b1569291/registry-server/0.log" Dec 03 12:54:57 crc kubenswrapper[4711]: I1203 12:54:57.311587 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wrqcm_de20138d-da5d-4975-b68e-cbd74b7be8cb/extract-utilities/0.log" Dec 03 12:54:57 crc kubenswrapper[4711]: I1203 12:54:57.337163 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wrqcm_de20138d-da5d-4975-b68e-cbd74b7be8cb/extract-content/0.log" Dec 03 12:54:57 crc kubenswrapper[4711]: I1203 12:54:57.367750 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wrqcm_de20138d-da5d-4975-b68e-cbd74b7be8cb/extract-content/0.log" Dec 03 12:54:57 crc kubenswrapper[4711]: I1203 12:54:57.481706 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wrqcm_de20138d-da5d-4975-b68e-cbd74b7be8cb/extract-content/0.log" Dec 03 12:54:57 crc kubenswrapper[4711]: I1203 12:54:57.544423 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wrqcm_de20138d-da5d-4975-b68e-cbd74b7be8cb/extract-utilities/0.log" Dec 03 12:54:57 crc kubenswrapper[4711]: I1203 12:54:57.658990 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wrqcm_de20138d-da5d-4975-b68e-cbd74b7be8cb/registry-server/0.log" Dec 03 12:54:57 crc kubenswrapper[4711]: I1203 12:54:57.736684 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tltll_8b4e35ef-e95e-42b5-ae3d-f5f92bfec867/extract-utilities/0.log" Dec 03 12:54:57 crc kubenswrapper[4711]: I1203 12:54:57.880753 4711 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-tltll_8b4e35ef-e95e-42b5-ae3d-f5f92bfec867/extract-utilities/0.log" Dec 03 12:54:57 crc kubenswrapper[4711]: I1203 12:54:57.889512 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tltll_8b4e35ef-e95e-42b5-ae3d-f5f92bfec867/extract-content/0.log" Dec 03 12:54:57 crc kubenswrapper[4711]: I1203 12:54:57.901770 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tltll_8b4e35ef-e95e-42b5-ae3d-f5f92bfec867/extract-content/0.log" Dec 03 12:54:58 crc kubenswrapper[4711]: I1203 12:54:58.067722 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tltll_8b4e35ef-e95e-42b5-ae3d-f5f92bfec867/extract-utilities/0.log" Dec 03 12:54:58 crc kubenswrapper[4711]: I1203 12:54:58.090656 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tltll_8b4e35ef-e95e-42b5-ae3d-f5f92bfec867/extract-content/0.log" Dec 03 12:54:58 crc kubenswrapper[4711]: I1203 12:54:58.384788 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tltll_8b4e35ef-e95e-42b5-ae3d-f5f92bfec867/registry-server/0.log" Dec 03 12:55:06 crc kubenswrapper[4711]: I1203 12:55:06.817366 4711 scope.go:117] "RemoveContainer" containerID="34c4c8649e6b4de50f304749dbfcdcd32a9ef18c76884ef8de700370ecd0a6e3" Dec 03 12:55:06 crc kubenswrapper[4711]: E1203 12:55:06.818185 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:55:18 crc 
kubenswrapper[4711]: I1203 12:55:18.817115 4711 scope.go:117] "RemoveContainer" containerID="34c4c8649e6b4de50f304749dbfcdcd32a9ef18c76884ef8de700370ecd0a6e3" Dec 03 12:55:18 crc kubenswrapper[4711]: E1203 12:55:18.817849 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:55:29 crc kubenswrapper[4711]: I1203 12:55:29.818688 4711 scope.go:117] "RemoveContainer" containerID="34c4c8649e6b4de50f304749dbfcdcd32a9ef18c76884ef8de700370ecd0a6e3" Dec 03 12:55:29 crc kubenswrapper[4711]: E1203 12:55:29.820185 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:55:41 crc kubenswrapper[4711]: I1203 12:55:41.817936 4711 scope.go:117] "RemoveContainer" containerID="34c4c8649e6b4de50f304749dbfcdcd32a9ef18c76884ef8de700370ecd0a6e3" Dec 03 12:55:41 crc kubenswrapper[4711]: E1203 12:55:41.822726 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 
03 12:55:47 crc kubenswrapper[4711]: I1203 12:55:47.732864 4711 scope.go:117] "RemoveContainer" containerID="4cf4fdd4e81c79e46e70376eee022c59b7215343bd29393a65871cd49825ffe3" Dec 03 12:55:47 crc kubenswrapper[4711]: I1203 12:55:47.762093 4711 scope.go:117] "RemoveContainer" containerID="606b5eecb5a23572e62b5a0263b6dcead6590b063fe003f1ea4ff62851f38799" Dec 03 12:55:47 crc kubenswrapper[4711]: I1203 12:55:47.790394 4711 scope.go:117] "RemoveContainer" containerID="c5ea8bb6ee74cb78f9da8d1839ea6f5a9861ebc94eebd9fb1aa97cd9773f83a5" Dec 03 12:55:47 crc kubenswrapper[4711]: I1203 12:55:47.851089 4711 scope.go:117] "RemoveContainer" containerID="bc7026f116c090900df51fc64f1df0aebfc142a0937d53ed5eeb39d2fc4ace8c" Dec 03 12:55:54 crc kubenswrapper[4711]: I1203 12:55:54.817304 4711 scope.go:117] "RemoveContainer" containerID="34c4c8649e6b4de50f304749dbfcdcd32a9ef18c76884ef8de700370ecd0a6e3" Dec 03 12:55:54 crc kubenswrapper[4711]: E1203 12:55:54.818052 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:56:04 crc kubenswrapper[4711]: I1203 12:56:04.023816 4711 generic.go:334] "Generic (PLEG): container finished" podID="cabb321f-81f8-4dad-b86f-5feead1e7c71" containerID="6e6bfe961a23d19b269d3260507cf761d111a9e509a0e134381c3405f6970682" exitCode=0 Dec 03 12:56:04 crc kubenswrapper[4711]: I1203 12:56:04.023971 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l2hvt/must-gather-8k6vl" event={"ID":"cabb321f-81f8-4dad-b86f-5feead1e7c71","Type":"ContainerDied","Data":"6e6bfe961a23d19b269d3260507cf761d111a9e509a0e134381c3405f6970682"} Dec 03 12:56:04 crc kubenswrapper[4711]: 
I1203 12:56:04.025458 4711 scope.go:117] "RemoveContainer" containerID="6e6bfe961a23d19b269d3260507cf761d111a9e509a0e134381c3405f6970682" Dec 03 12:56:04 crc kubenswrapper[4711]: I1203 12:56:04.493802 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-l2hvt_must-gather-8k6vl_cabb321f-81f8-4dad-b86f-5feead1e7c71/gather/0.log" Dec 03 12:56:08 crc kubenswrapper[4711]: I1203 12:56:08.817483 4711 scope.go:117] "RemoveContainer" containerID="34c4c8649e6b4de50f304749dbfcdcd32a9ef18c76884ef8de700370ecd0a6e3" Dec 03 12:56:08 crc kubenswrapper[4711]: E1203 12:56:08.818143 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:56:11 crc kubenswrapper[4711]: I1203 12:56:11.325280 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-l2hvt/must-gather-8k6vl"] Dec 03 12:56:11 crc kubenswrapper[4711]: I1203 12:56:11.326243 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-l2hvt/must-gather-8k6vl" podUID="cabb321f-81f8-4dad-b86f-5feead1e7c71" containerName="copy" containerID="cri-o://f69f752d495101034e6450769163753313cb74a392b55494932f5337916db572" gracePeriod=2 Dec 03 12:56:11 crc kubenswrapper[4711]: I1203 12:56:11.336382 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-l2hvt/must-gather-8k6vl"] Dec 03 12:56:11 crc kubenswrapper[4711]: I1203 12:56:11.721966 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-l2hvt_must-gather-8k6vl_cabb321f-81f8-4dad-b86f-5feead1e7c71/copy/0.log" Dec 03 12:56:11 crc kubenswrapper[4711]: 
I1203 12:56:11.723086 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l2hvt/must-gather-8k6vl" Dec 03 12:56:11 crc kubenswrapper[4711]: I1203 12:56:11.816413 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cabb321f-81f8-4dad-b86f-5feead1e7c71-must-gather-output\") pod \"cabb321f-81f8-4dad-b86f-5feead1e7c71\" (UID: \"cabb321f-81f8-4dad-b86f-5feead1e7c71\") " Dec 03 12:56:11 crc kubenswrapper[4711]: I1203 12:56:11.816505 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmhzc\" (UniqueName: \"kubernetes.io/projected/cabb321f-81f8-4dad-b86f-5feead1e7c71-kube-api-access-wmhzc\") pod \"cabb321f-81f8-4dad-b86f-5feead1e7c71\" (UID: \"cabb321f-81f8-4dad-b86f-5feead1e7c71\") " Dec 03 12:56:11 crc kubenswrapper[4711]: I1203 12:56:11.823105 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cabb321f-81f8-4dad-b86f-5feead1e7c71-kube-api-access-wmhzc" (OuterVolumeSpecName: "kube-api-access-wmhzc") pod "cabb321f-81f8-4dad-b86f-5feead1e7c71" (UID: "cabb321f-81f8-4dad-b86f-5feead1e7c71"). InnerVolumeSpecName "kube-api-access-wmhzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:56:11 crc kubenswrapper[4711]: I1203 12:56:11.892516 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cabb321f-81f8-4dad-b86f-5feead1e7c71-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "cabb321f-81f8-4dad-b86f-5feead1e7c71" (UID: "cabb321f-81f8-4dad-b86f-5feead1e7c71"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:56:11 crc kubenswrapper[4711]: I1203 12:56:11.919464 4711 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cabb321f-81f8-4dad-b86f-5feead1e7c71-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 03 12:56:11 crc kubenswrapper[4711]: I1203 12:56:11.919500 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmhzc\" (UniqueName: \"kubernetes.io/projected/cabb321f-81f8-4dad-b86f-5feead1e7c71-kube-api-access-wmhzc\") on node \"crc\" DevicePath \"\"" Dec 03 12:56:12 crc kubenswrapper[4711]: I1203 12:56:12.091482 4711 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-l2hvt_must-gather-8k6vl_cabb321f-81f8-4dad-b86f-5feead1e7c71/copy/0.log" Dec 03 12:56:12 crc kubenswrapper[4711]: I1203 12:56:12.092375 4711 generic.go:334] "Generic (PLEG): container finished" podID="cabb321f-81f8-4dad-b86f-5feead1e7c71" containerID="f69f752d495101034e6450769163753313cb74a392b55494932f5337916db572" exitCode=143 Dec 03 12:56:12 crc kubenswrapper[4711]: I1203 12:56:12.092430 4711 scope.go:117] "RemoveContainer" containerID="f69f752d495101034e6450769163753313cb74a392b55494932f5337916db572" Dec 03 12:56:12 crc kubenswrapper[4711]: I1203 12:56:12.092565 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l2hvt/must-gather-8k6vl" Dec 03 12:56:12 crc kubenswrapper[4711]: I1203 12:56:12.129213 4711 scope.go:117] "RemoveContainer" containerID="6e6bfe961a23d19b269d3260507cf761d111a9e509a0e134381c3405f6970682" Dec 03 12:56:12 crc kubenswrapper[4711]: I1203 12:56:12.219839 4711 scope.go:117] "RemoveContainer" containerID="f69f752d495101034e6450769163753313cb74a392b55494932f5337916db572" Dec 03 12:56:12 crc kubenswrapper[4711]: E1203 12:56:12.222245 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f69f752d495101034e6450769163753313cb74a392b55494932f5337916db572\": container with ID starting with f69f752d495101034e6450769163753313cb74a392b55494932f5337916db572 not found: ID does not exist" containerID="f69f752d495101034e6450769163753313cb74a392b55494932f5337916db572" Dec 03 12:56:12 crc kubenswrapper[4711]: I1203 12:56:12.222295 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f69f752d495101034e6450769163753313cb74a392b55494932f5337916db572"} err="failed to get container status \"f69f752d495101034e6450769163753313cb74a392b55494932f5337916db572\": rpc error: code = NotFound desc = could not find container \"f69f752d495101034e6450769163753313cb74a392b55494932f5337916db572\": container with ID starting with f69f752d495101034e6450769163753313cb74a392b55494932f5337916db572 not found: ID does not exist" Dec 03 12:56:12 crc kubenswrapper[4711]: I1203 12:56:12.222321 4711 scope.go:117] "RemoveContainer" containerID="6e6bfe961a23d19b269d3260507cf761d111a9e509a0e134381c3405f6970682" Dec 03 12:56:12 crc kubenswrapper[4711]: E1203 12:56:12.222743 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e6bfe961a23d19b269d3260507cf761d111a9e509a0e134381c3405f6970682\": container with ID starting with 
6e6bfe961a23d19b269d3260507cf761d111a9e509a0e134381c3405f6970682 not found: ID does not exist" containerID="6e6bfe961a23d19b269d3260507cf761d111a9e509a0e134381c3405f6970682" Dec 03 12:56:12 crc kubenswrapper[4711]: I1203 12:56:12.222758 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e6bfe961a23d19b269d3260507cf761d111a9e509a0e134381c3405f6970682"} err="failed to get container status \"6e6bfe961a23d19b269d3260507cf761d111a9e509a0e134381c3405f6970682\": rpc error: code = NotFound desc = could not find container \"6e6bfe961a23d19b269d3260507cf761d111a9e509a0e134381c3405f6970682\": container with ID starting with 6e6bfe961a23d19b269d3260507cf761d111a9e509a0e134381c3405f6970682 not found: ID does not exist" Dec 03 12:56:13 crc kubenswrapper[4711]: I1203 12:56:13.830397 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cabb321f-81f8-4dad-b86f-5feead1e7c71" path="/var/lib/kubelet/pods/cabb321f-81f8-4dad-b86f-5feead1e7c71/volumes" Dec 03 12:56:23 crc kubenswrapper[4711]: I1203 12:56:23.819041 4711 scope.go:117] "RemoveContainer" containerID="34c4c8649e6b4de50f304749dbfcdcd32a9ef18c76884ef8de700370ecd0a6e3" Dec 03 12:56:23 crc kubenswrapper[4711]: E1203 12:56:23.819788 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:56:29 crc kubenswrapper[4711]: I1203 12:56:29.575327 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rfh4j"] Dec 03 12:56:29 crc kubenswrapper[4711]: E1203 12:56:29.576145 4711 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cabb321f-81f8-4dad-b86f-5feead1e7c71" containerName="gather" Dec 03 12:56:29 crc kubenswrapper[4711]: I1203 12:56:29.576159 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="cabb321f-81f8-4dad-b86f-5feead1e7c71" containerName="gather" Dec 03 12:56:29 crc kubenswrapper[4711]: E1203 12:56:29.576188 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cabb321f-81f8-4dad-b86f-5feead1e7c71" containerName="copy" Dec 03 12:56:29 crc kubenswrapper[4711]: I1203 12:56:29.576194 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="cabb321f-81f8-4dad-b86f-5feead1e7c71" containerName="copy" Dec 03 12:56:29 crc kubenswrapper[4711]: I1203 12:56:29.576322 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="cabb321f-81f8-4dad-b86f-5feead1e7c71" containerName="gather" Dec 03 12:56:29 crc kubenswrapper[4711]: I1203 12:56:29.576339 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="cabb321f-81f8-4dad-b86f-5feead1e7c71" containerName="copy" Dec 03 12:56:29 crc kubenswrapper[4711]: I1203 12:56:29.577367 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rfh4j" Dec 03 12:56:29 crc kubenswrapper[4711]: I1203 12:56:29.594669 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rfh4j"] Dec 03 12:56:29 crc kubenswrapper[4711]: I1203 12:56:29.699553 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjhb2\" (UniqueName: \"kubernetes.io/projected/03477aa2-7bbd-46ba-a023-7563ec98f8e3-kube-api-access-fjhb2\") pod \"redhat-operators-rfh4j\" (UID: \"03477aa2-7bbd-46ba-a023-7563ec98f8e3\") " pod="openshift-marketplace/redhat-operators-rfh4j" Dec 03 12:56:29 crc kubenswrapper[4711]: I1203 12:56:29.699636 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03477aa2-7bbd-46ba-a023-7563ec98f8e3-utilities\") pod \"redhat-operators-rfh4j\" (UID: \"03477aa2-7bbd-46ba-a023-7563ec98f8e3\") " pod="openshift-marketplace/redhat-operators-rfh4j" Dec 03 12:56:29 crc kubenswrapper[4711]: I1203 12:56:29.699713 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03477aa2-7bbd-46ba-a023-7563ec98f8e3-catalog-content\") pod \"redhat-operators-rfh4j\" (UID: \"03477aa2-7bbd-46ba-a023-7563ec98f8e3\") " pod="openshift-marketplace/redhat-operators-rfh4j" Dec 03 12:56:29 crc kubenswrapper[4711]: I1203 12:56:29.801057 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjhb2\" (UniqueName: \"kubernetes.io/projected/03477aa2-7bbd-46ba-a023-7563ec98f8e3-kube-api-access-fjhb2\") pod \"redhat-operators-rfh4j\" (UID: \"03477aa2-7bbd-46ba-a023-7563ec98f8e3\") " pod="openshift-marketplace/redhat-operators-rfh4j" Dec 03 12:56:29 crc kubenswrapper[4711]: I1203 12:56:29.801120 4711 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03477aa2-7bbd-46ba-a023-7563ec98f8e3-utilities\") pod \"redhat-operators-rfh4j\" (UID: \"03477aa2-7bbd-46ba-a023-7563ec98f8e3\") " pod="openshift-marketplace/redhat-operators-rfh4j" Dec 03 12:56:29 crc kubenswrapper[4711]: I1203 12:56:29.801158 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03477aa2-7bbd-46ba-a023-7563ec98f8e3-catalog-content\") pod \"redhat-operators-rfh4j\" (UID: \"03477aa2-7bbd-46ba-a023-7563ec98f8e3\") " pod="openshift-marketplace/redhat-operators-rfh4j" Dec 03 12:56:29 crc kubenswrapper[4711]: I1203 12:56:29.801718 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03477aa2-7bbd-46ba-a023-7563ec98f8e3-catalog-content\") pod \"redhat-operators-rfh4j\" (UID: \"03477aa2-7bbd-46ba-a023-7563ec98f8e3\") " pod="openshift-marketplace/redhat-operators-rfh4j" Dec 03 12:56:29 crc kubenswrapper[4711]: I1203 12:56:29.801712 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03477aa2-7bbd-46ba-a023-7563ec98f8e3-utilities\") pod \"redhat-operators-rfh4j\" (UID: \"03477aa2-7bbd-46ba-a023-7563ec98f8e3\") " pod="openshift-marketplace/redhat-operators-rfh4j" Dec 03 12:56:29 crc kubenswrapper[4711]: I1203 12:56:29.827969 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjhb2\" (UniqueName: \"kubernetes.io/projected/03477aa2-7bbd-46ba-a023-7563ec98f8e3-kube-api-access-fjhb2\") pod \"redhat-operators-rfh4j\" (UID: \"03477aa2-7bbd-46ba-a023-7563ec98f8e3\") " pod="openshift-marketplace/redhat-operators-rfh4j" Dec 03 12:56:29 crc kubenswrapper[4711]: I1203 12:56:29.900698 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rfh4j" Dec 03 12:56:30 crc kubenswrapper[4711]: I1203 12:56:30.354674 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rfh4j"] Dec 03 12:56:31 crc kubenswrapper[4711]: I1203 12:56:31.249936 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rfh4j" event={"ID":"03477aa2-7bbd-46ba-a023-7563ec98f8e3","Type":"ContainerStarted","Data":"dcdf6bb4b9649a3750b9703b3e81b9956c7e44fd61f91b172dab6710e0103680"} Dec 03 12:56:32 crc kubenswrapper[4711]: I1203 12:56:32.259143 4711 generic.go:334] "Generic (PLEG): container finished" podID="03477aa2-7bbd-46ba-a023-7563ec98f8e3" containerID="c2158ed9ad11a48cb51a47d0dbe38b949a7835d084b35df585a7c19aa0aeff07" exitCode=0 Dec 03 12:56:32 crc kubenswrapper[4711]: I1203 12:56:32.259227 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rfh4j" event={"ID":"03477aa2-7bbd-46ba-a023-7563ec98f8e3","Type":"ContainerDied","Data":"c2158ed9ad11a48cb51a47d0dbe38b949a7835d084b35df585a7c19aa0aeff07"} Dec 03 12:56:32 crc kubenswrapper[4711]: I1203 12:56:32.262615 4711 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 12:56:34 crc kubenswrapper[4711]: I1203 12:56:34.276614 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rfh4j" event={"ID":"03477aa2-7bbd-46ba-a023-7563ec98f8e3","Type":"ContainerStarted","Data":"a76e33e00ccc07479f1db466e3957648ecc245d1f716e9b5fc8dde3868407671"} Dec 03 12:56:34 crc kubenswrapper[4711]: I1203 12:56:34.817559 4711 scope.go:117] "RemoveContainer" containerID="34c4c8649e6b4de50f304749dbfcdcd32a9ef18c76884ef8de700370ecd0a6e3" Dec 03 12:56:34 crc kubenswrapper[4711]: E1203 12:56:34.817983 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:56:35 crc kubenswrapper[4711]: I1203 12:56:35.288150 4711 generic.go:334] "Generic (PLEG): container finished" podID="03477aa2-7bbd-46ba-a023-7563ec98f8e3" containerID="a76e33e00ccc07479f1db466e3957648ecc245d1f716e9b5fc8dde3868407671" exitCode=0 Dec 03 12:56:35 crc kubenswrapper[4711]: I1203 12:56:35.288997 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rfh4j" event={"ID":"03477aa2-7bbd-46ba-a023-7563ec98f8e3","Type":"ContainerDied","Data":"a76e33e00ccc07479f1db466e3957648ecc245d1f716e9b5fc8dde3868407671"} Dec 03 12:56:36 crc kubenswrapper[4711]: I1203 12:56:36.301389 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rfh4j" event={"ID":"03477aa2-7bbd-46ba-a023-7563ec98f8e3","Type":"ContainerStarted","Data":"5c7ee17de239bf8fa9d3d0ef78e9313be22a4c8e94d2285fcf7fb5a0811e92f6"} Dec 03 12:56:36 crc kubenswrapper[4711]: I1203 12:56:36.325721 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rfh4j" podStartSLOduration=3.502511442 podStartE2EDuration="7.325703226s" podCreationTimestamp="2025-12-03 12:56:29 +0000 UTC" firstStartedPulling="2025-12-03 12:56:32.26239555 +0000 UTC m=+2510.931646805" lastFinishedPulling="2025-12-03 12:56:36.085587314 +0000 UTC m=+2514.754838589" observedRunningTime="2025-12-03 12:56:36.321802248 +0000 UTC m=+2514.991053543" watchObservedRunningTime="2025-12-03 12:56:36.325703226 +0000 UTC m=+2514.994954481" Dec 03 12:56:39 crc kubenswrapper[4711]: I1203 12:56:39.901790 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-rfh4j" Dec 03 12:56:39 crc kubenswrapper[4711]: I1203 12:56:39.902300 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rfh4j" Dec 03 12:56:40 crc kubenswrapper[4711]: I1203 12:56:40.960221 4711 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rfh4j" podUID="03477aa2-7bbd-46ba-a023-7563ec98f8e3" containerName="registry-server" probeResult="failure" output=< Dec 03 12:56:40 crc kubenswrapper[4711]: timeout: failed to connect service ":50051" within 1s Dec 03 12:56:40 crc kubenswrapper[4711]: > Dec 03 12:56:45 crc kubenswrapper[4711]: I1203 12:56:45.818476 4711 scope.go:117] "RemoveContainer" containerID="34c4c8649e6b4de50f304749dbfcdcd32a9ef18c76884ef8de700370ecd0a6e3" Dec 03 12:56:45 crc kubenswrapper[4711]: E1203 12:56:45.819235 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:56:47 crc kubenswrapper[4711]: I1203 12:56:47.937688 4711 scope.go:117] "RemoveContainer" containerID="69bf6bbc0e3b8b2d55660972252e816798aee39d3a5ddc6eb78a284efd29721c" Dec 03 12:56:47 crc kubenswrapper[4711]: I1203 12:56:47.969323 4711 scope.go:117] "RemoveContainer" containerID="6bce48f682c206b57606dcb50948562f629ae57ce54131a03311755ba2859454" Dec 03 12:56:48 crc kubenswrapper[4711]: I1203 12:56:48.028783 4711 scope.go:117] "RemoveContainer" containerID="00512b127b4de6603bd549e8a899549fc0a1fb96678d6cb33014e2d9fdeaf07a" Dec 03 12:56:48 crc kubenswrapper[4711]: I1203 12:56:48.091035 4711 scope.go:117] "RemoveContainer" 
containerID="e11b1a0d824fb16772da6315083e47a8071c5073a511a2cc140db2459927701e" Dec 03 12:56:49 crc kubenswrapper[4711]: I1203 12:56:49.958263 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rfh4j" Dec 03 12:56:50 crc kubenswrapper[4711]: I1203 12:56:50.011584 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rfh4j" Dec 03 12:56:50 crc kubenswrapper[4711]: I1203 12:56:50.212667 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rfh4j"] Dec 03 12:56:51 crc kubenswrapper[4711]: I1203 12:56:51.432315 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rfh4j" podUID="03477aa2-7bbd-46ba-a023-7563ec98f8e3" containerName="registry-server" containerID="cri-o://5c7ee17de239bf8fa9d3d0ef78e9313be22a4c8e94d2285fcf7fb5a0811e92f6" gracePeriod=2 Dec 03 12:56:53 crc kubenswrapper[4711]: I1203 12:56:53.453954 4711 generic.go:334] "Generic (PLEG): container finished" podID="03477aa2-7bbd-46ba-a023-7563ec98f8e3" containerID="5c7ee17de239bf8fa9d3d0ef78e9313be22a4c8e94d2285fcf7fb5a0811e92f6" exitCode=0 Dec 03 12:56:53 crc kubenswrapper[4711]: I1203 12:56:53.454014 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rfh4j" event={"ID":"03477aa2-7bbd-46ba-a023-7563ec98f8e3","Type":"ContainerDied","Data":"5c7ee17de239bf8fa9d3d0ef78e9313be22a4c8e94d2285fcf7fb5a0811e92f6"} Dec 03 12:56:53 crc kubenswrapper[4711]: I1203 12:56:53.669725 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rfh4j" Dec 03 12:56:53 crc kubenswrapper[4711]: I1203 12:56:53.771973 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03477aa2-7bbd-46ba-a023-7563ec98f8e3-utilities\") pod \"03477aa2-7bbd-46ba-a023-7563ec98f8e3\" (UID: \"03477aa2-7bbd-46ba-a023-7563ec98f8e3\") " Dec 03 12:56:53 crc kubenswrapper[4711]: I1203 12:56:53.772056 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03477aa2-7bbd-46ba-a023-7563ec98f8e3-catalog-content\") pod \"03477aa2-7bbd-46ba-a023-7563ec98f8e3\" (UID: \"03477aa2-7bbd-46ba-a023-7563ec98f8e3\") " Dec 03 12:56:53 crc kubenswrapper[4711]: I1203 12:56:53.772175 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjhb2\" (UniqueName: \"kubernetes.io/projected/03477aa2-7bbd-46ba-a023-7563ec98f8e3-kube-api-access-fjhb2\") pod \"03477aa2-7bbd-46ba-a023-7563ec98f8e3\" (UID: \"03477aa2-7bbd-46ba-a023-7563ec98f8e3\") " Dec 03 12:56:53 crc kubenswrapper[4711]: I1203 12:56:53.773761 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03477aa2-7bbd-46ba-a023-7563ec98f8e3-utilities" (OuterVolumeSpecName: "utilities") pod "03477aa2-7bbd-46ba-a023-7563ec98f8e3" (UID: "03477aa2-7bbd-46ba-a023-7563ec98f8e3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:56:53 crc kubenswrapper[4711]: I1203 12:56:53.777233 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03477aa2-7bbd-46ba-a023-7563ec98f8e3-kube-api-access-fjhb2" (OuterVolumeSpecName: "kube-api-access-fjhb2") pod "03477aa2-7bbd-46ba-a023-7563ec98f8e3" (UID: "03477aa2-7bbd-46ba-a023-7563ec98f8e3"). InnerVolumeSpecName "kube-api-access-fjhb2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:56:53 crc kubenswrapper[4711]: I1203 12:56:53.873776 4711 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03477aa2-7bbd-46ba-a023-7563ec98f8e3-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:56:53 crc kubenswrapper[4711]: I1203 12:56:53.873819 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjhb2\" (UniqueName: \"kubernetes.io/projected/03477aa2-7bbd-46ba-a023-7563ec98f8e3-kube-api-access-fjhb2\") on node \"crc\" DevicePath \"\"" Dec 03 12:56:53 crc kubenswrapper[4711]: I1203 12:56:53.888794 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03477aa2-7bbd-46ba-a023-7563ec98f8e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "03477aa2-7bbd-46ba-a023-7563ec98f8e3" (UID: "03477aa2-7bbd-46ba-a023-7563ec98f8e3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:56:53 crc kubenswrapper[4711]: I1203 12:56:53.975445 4711 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03477aa2-7bbd-46ba-a023-7563ec98f8e3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:56:54 crc kubenswrapper[4711]: I1203 12:56:54.466312 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rfh4j" event={"ID":"03477aa2-7bbd-46ba-a023-7563ec98f8e3","Type":"ContainerDied","Data":"dcdf6bb4b9649a3750b9703b3e81b9956c7e44fd61f91b172dab6710e0103680"} Dec 03 12:56:54 crc kubenswrapper[4711]: I1203 12:56:54.466390 4711 scope.go:117] "RemoveContainer" containerID="5c7ee17de239bf8fa9d3d0ef78e9313be22a4c8e94d2285fcf7fb5a0811e92f6" Dec 03 12:56:54 crc kubenswrapper[4711]: I1203 12:56:54.466447 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rfh4j" Dec 03 12:56:54 crc kubenswrapper[4711]: I1203 12:56:54.494422 4711 scope.go:117] "RemoveContainer" containerID="a76e33e00ccc07479f1db466e3957648ecc245d1f716e9b5fc8dde3868407671" Dec 03 12:56:54 crc kubenswrapper[4711]: I1203 12:56:54.523631 4711 scope.go:117] "RemoveContainer" containerID="c2158ed9ad11a48cb51a47d0dbe38b949a7835d084b35df585a7c19aa0aeff07" Dec 03 12:56:54 crc kubenswrapper[4711]: I1203 12:56:54.534255 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rfh4j"] Dec 03 12:56:54 crc kubenswrapper[4711]: I1203 12:56:54.542489 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rfh4j"] Dec 03 12:56:55 crc kubenswrapper[4711]: I1203 12:56:55.825988 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03477aa2-7bbd-46ba-a023-7563ec98f8e3" path="/var/lib/kubelet/pods/03477aa2-7bbd-46ba-a023-7563ec98f8e3/volumes" Dec 03 12:57:00 crc kubenswrapper[4711]: I1203 12:57:00.817301 4711 scope.go:117] "RemoveContainer" containerID="34c4c8649e6b4de50f304749dbfcdcd32a9ef18c76884ef8de700370ecd0a6e3" Dec 03 12:57:00 crc kubenswrapper[4711]: E1203 12:57:00.819248 4711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-52jgg_openshift-machine-config-operator(776e7d35-d59b-4d4a-97cd-aec4f2441c1e)\"" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" Dec 03 12:57:12 crc kubenswrapper[4711]: I1203 12:57:12.817839 4711 scope.go:117] "RemoveContainer" containerID="34c4c8649e6b4de50f304749dbfcdcd32a9ef18c76884ef8de700370ecd0a6e3" Dec 03 12:57:13 crc kubenswrapper[4711]: I1203 12:57:13.622673 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-52jgg" event={"ID":"776e7d35-d59b-4d4a-97cd-aec4f2441c1e","Type":"ContainerStarted","Data":"076734f6b97d3ea3a4f6b67e145b0fada6a5ce876a0fbb325602c953ddd0000f"} Dec 03 12:57:48 crc kubenswrapper[4711]: I1203 12:57:48.178042 4711 scope.go:117] "RemoveContainer" containerID="2f0b6add244cc20c53f8a0ac35ea7317a2cd7e7f292ea8a90caa41ba4b72d187" Dec 03 12:57:48 crc kubenswrapper[4711]: I1203 12:57:48.233481 4711 scope.go:117] "RemoveContainer" containerID="bdcfcd5dde4ffac8e4b1493dbbbfd60a6e12d59e046542812db8f50d8cf65d06" Dec 03 12:57:48 crc kubenswrapper[4711]: I1203 12:57:48.255482 4711 scope.go:117] "RemoveContainer" containerID="8642c90613c5ec654cbdb083f5b9203693c9cd0e2a0e928a42441d179c5e2162" Dec 03 12:57:48 crc kubenswrapper[4711]: I1203 12:57:48.272166 4711 scope.go:117] "RemoveContainer" containerID="6df0dc0ea55c01dcfd7b5c6f8501d91d49b25669cba8966664833c8117515079" Dec 03 12:57:48 crc kubenswrapper[4711]: I1203 12:57:48.300533 4711 scope.go:117] "RemoveContainer" containerID="355f127ffd8448aceba29c3b05d09b59993e407776d176dd53889aba772c6a23" Dec 03 12:57:48 crc kubenswrapper[4711]: I1203 12:57:48.326256 4711 scope.go:117] "RemoveContainer" containerID="6baa336b4bc24571c7bee7a979cd709c72b3854c08c6077dad7bab78b7307fdf" Dec 03 12:58:48 crc kubenswrapper[4711]: I1203 12:58:48.438138 4711 scope.go:117] "RemoveContainer" containerID="f7de3240d0d84284c868389b052511cc57e6fff6a6d4ff4144ecd47aca58b96b" Dec 03 12:59:35 crc kubenswrapper[4711]: I1203 12:59:35.402039 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:59:35 crc kubenswrapper[4711]: I1203 12:59:35.402835 4711 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:59:39 crc kubenswrapper[4711]: I1203 12:59:39.632139 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-db8gg"] Dec 03 12:59:39 crc kubenswrapper[4711]: E1203 12:59:39.632800 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03477aa2-7bbd-46ba-a023-7563ec98f8e3" containerName="extract-content" Dec 03 12:59:39 crc kubenswrapper[4711]: I1203 12:59:39.632819 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="03477aa2-7bbd-46ba-a023-7563ec98f8e3" containerName="extract-content" Dec 03 12:59:39 crc kubenswrapper[4711]: E1203 12:59:39.632865 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03477aa2-7bbd-46ba-a023-7563ec98f8e3" containerName="registry-server" Dec 03 12:59:39 crc kubenswrapper[4711]: I1203 12:59:39.632876 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="03477aa2-7bbd-46ba-a023-7563ec98f8e3" containerName="registry-server" Dec 03 12:59:39 crc kubenswrapper[4711]: E1203 12:59:39.632895 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03477aa2-7bbd-46ba-a023-7563ec98f8e3" containerName="extract-utilities" Dec 03 12:59:39 crc kubenswrapper[4711]: I1203 12:59:39.632926 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="03477aa2-7bbd-46ba-a023-7563ec98f8e3" containerName="extract-utilities" Dec 03 12:59:39 crc kubenswrapper[4711]: I1203 12:59:39.633107 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="03477aa2-7bbd-46ba-a023-7563ec98f8e3" containerName="registry-server" Dec 03 12:59:39 crc kubenswrapper[4711]: I1203 12:59:39.634344 4711 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-db8gg"
Dec 03 12:59:39 crc kubenswrapper[4711]: I1203 12:59:39.647228 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-db8gg"]
Dec 03 12:59:39 crc kubenswrapper[4711]: I1203 12:59:39.777734 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17794825-1aa3-4515-b4e5-8e7ada16bdbc-utilities\") pod \"redhat-marketplace-db8gg\" (UID: \"17794825-1aa3-4515-b4e5-8e7ada16bdbc\") " pod="openshift-marketplace/redhat-marketplace-db8gg"
Dec 03 12:59:39 crc kubenswrapper[4711]: I1203 12:59:39.777812 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frb2p\" (UniqueName: \"kubernetes.io/projected/17794825-1aa3-4515-b4e5-8e7ada16bdbc-kube-api-access-frb2p\") pod \"redhat-marketplace-db8gg\" (UID: \"17794825-1aa3-4515-b4e5-8e7ada16bdbc\") " pod="openshift-marketplace/redhat-marketplace-db8gg"
Dec 03 12:59:39 crc kubenswrapper[4711]: I1203 12:59:39.777842 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17794825-1aa3-4515-b4e5-8e7ada16bdbc-catalog-content\") pod \"redhat-marketplace-db8gg\" (UID: \"17794825-1aa3-4515-b4e5-8e7ada16bdbc\") " pod="openshift-marketplace/redhat-marketplace-db8gg"
Dec 03 12:59:39 crc kubenswrapper[4711]: I1203 12:59:39.878869 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17794825-1aa3-4515-b4e5-8e7ada16bdbc-utilities\") pod \"redhat-marketplace-db8gg\" (UID: \"17794825-1aa3-4515-b4e5-8e7ada16bdbc\") " pod="openshift-marketplace/redhat-marketplace-db8gg"
Dec 03 12:59:39 crc kubenswrapper[4711]: I1203 12:59:39.879012 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frb2p\" (UniqueName: \"kubernetes.io/projected/17794825-1aa3-4515-b4e5-8e7ada16bdbc-kube-api-access-frb2p\") pod \"redhat-marketplace-db8gg\" (UID: \"17794825-1aa3-4515-b4e5-8e7ada16bdbc\") " pod="openshift-marketplace/redhat-marketplace-db8gg"
Dec 03 12:59:39 crc kubenswrapper[4711]: I1203 12:59:39.879044 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17794825-1aa3-4515-b4e5-8e7ada16bdbc-catalog-content\") pod \"redhat-marketplace-db8gg\" (UID: \"17794825-1aa3-4515-b4e5-8e7ada16bdbc\") " pod="openshift-marketplace/redhat-marketplace-db8gg"
Dec 03 12:59:39 crc kubenswrapper[4711]: I1203 12:59:39.879572 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17794825-1aa3-4515-b4e5-8e7ada16bdbc-utilities\") pod \"redhat-marketplace-db8gg\" (UID: \"17794825-1aa3-4515-b4e5-8e7ada16bdbc\") " pod="openshift-marketplace/redhat-marketplace-db8gg"
Dec 03 12:59:39 crc kubenswrapper[4711]: I1203 12:59:39.879591 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17794825-1aa3-4515-b4e5-8e7ada16bdbc-catalog-content\") pod \"redhat-marketplace-db8gg\" (UID: \"17794825-1aa3-4515-b4e5-8e7ada16bdbc\") " pod="openshift-marketplace/redhat-marketplace-db8gg"
Dec 03 12:59:39 crc kubenswrapper[4711]: I1203 12:59:39.899296 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frb2p\" (UniqueName: \"kubernetes.io/projected/17794825-1aa3-4515-b4e5-8e7ada16bdbc-kube-api-access-frb2p\") pod \"redhat-marketplace-db8gg\" (UID: \"17794825-1aa3-4515-b4e5-8e7ada16bdbc\") " pod="openshift-marketplace/redhat-marketplace-db8gg"
Dec 03 12:59:39 crc kubenswrapper[4711]: I1203 12:59:39.955397 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-db8gg"
Dec 03 12:59:40 crc kubenswrapper[4711]: I1203 12:59:40.390695 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-db8gg"]
Dec 03 12:59:40 crc kubenswrapper[4711]: I1203 12:59:40.921340 4711 generic.go:334] "Generic (PLEG): container finished" podID="17794825-1aa3-4515-b4e5-8e7ada16bdbc" containerID="de5f681a6b41a2046643ec38219f974f6c080730a8613266ff28ab09ce0f7a7a" exitCode=0
Dec 03 12:59:40 crc kubenswrapper[4711]: I1203 12:59:40.921550 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-db8gg" event={"ID":"17794825-1aa3-4515-b4e5-8e7ada16bdbc","Type":"ContainerDied","Data":"de5f681a6b41a2046643ec38219f974f6c080730a8613266ff28ab09ce0f7a7a"}
Dec 03 12:59:40 crc kubenswrapper[4711]: I1203 12:59:40.923072 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-db8gg" event={"ID":"17794825-1aa3-4515-b4e5-8e7ada16bdbc","Type":"ContainerStarted","Data":"eef87cf1afcc5f2b2ec9c57f8e7f8e2f56c1c992c63a49c766a38bf5f92eb096"}
Dec 03 12:59:41 crc kubenswrapper[4711]: I1203 12:59:41.931200 4711 generic.go:334] "Generic (PLEG): container finished" podID="17794825-1aa3-4515-b4e5-8e7ada16bdbc" containerID="581f3d886db7dd8b30d266115366076653cc083e29e1021fff2de4780d776a27" exitCode=0
Dec 03 12:59:41 crc kubenswrapper[4711]: I1203 12:59:41.931264 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-db8gg" event={"ID":"17794825-1aa3-4515-b4e5-8e7ada16bdbc","Type":"ContainerDied","Data":"581f3d886db7dd8b30d266115366076653cc083e29e1021fff2de4780d776a27"}
Dec 03 12:59:43 crc kubenswrapper[4711]: I1203 12:59:43.948230 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-db8gg" event={"ID":"17794825-1aa3-4515-b4e5-8e7ada16bdbc","Type":"ContainerStarted","Data":"02b65ce95494593481bddb932233cc73f5bc8e629b43134efc57c14c46e2b0e7"}
Dec 03 12:59:43 crc kubenswrapper[4711]: I1203 12:59:43.970651 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-db8gg" podStartSLOduration=2.431736723 podStartE2EDuration="4.970632306s" podCreationTimestamp="2025-12-03 12:59:39 +0000 UTC" firstStartedPulling="2025-12-03 12:59:40.923110907 +0000 UTC m=+2699.592362162" lastFinishedPulling="2025-12-03 12:59:43.46200645 +0000 UTC m=+2702.131257745" observedRunningTime="2025-12-03 12:59:43.965273788 +0000 UTC m=+2702.634525053" watchObservedRunningTime="2025-12-03 12:59:43.970632306 +0000 UTC m=+2702.639883551"
Dec 03 12:59:49 crc kubenswrapper[4711]: I1203 12:59:49.089162 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4xzj6"]
Dec 03 12:59:49 crc kubenswrapper[4711]: I1203 12:59:49.091691 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4xzj6"
Dec 03 12:59:49 crc kubenswrapper[4711]: I1203 12:59:49.110987 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4xzj6"]
Dec 03 12:59:49 crc kubenswrapper[4711]: I1203 12:59:49.220547 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp5rp\" (UniqueName: \"kubernetes.io/projected/e5561ca1-6740-4603-ab70-aa59f825fef6-kube-api-access-dp5rp\") pod \"community-operators-4xzj6\" (UID: \"e5561ca1-6740-4603-ab70-aa59f825fef6\") " pod="openshift-marketplace/community-operators-4xzj6"
Dec 03 12:59:49 crc kubenswrapper[4711]: I1203 12:59:49.220624 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5561ca1-6740-4603-ab70-aa59f825fef6-utilities\") pod \"community-operators-4xzj6\" (UID: \"e5561ca1-6740-4603-ab70-aa59f825fef6\") " pod="openshift-marketplace/community-operators-4xzj6"
Dec 03 12:59:49 crc kubenswrapper[4711]: I1203 12:59:49.220705 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5561ca1-6740-4603-ab70-aa59f825fef6-catalog-content\") pod \"community-operators-4xzj6\" (UID: \"e5561ca1-6740-4603-ab70-aa59f825fef6\") " pod="openshift-marketplace/community-operators-4xzj6"
Dec 03 12:59:49 crc kubenswrapper[4711]: I1203 12:59:49.321701 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5561ca1-6740-4603-ab70-aa59f825fef6-catalog-content\") pod \"community-operators-4xzj6\" (UID: \"e5561ca1-6740-4603-ab70-aa59f825fef6\") " pod="openshift-marketplace/community-operators-4xzj6"
Dec 03 12:59:49 crc kubenswrapper[4711]: I1203 12:59:49.321774 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp5rp\" (UniqueName: \"kubernetes.io/projected/e5561ca1-6740-4603-ab70-aa59f825fef6-kube-api-access-dp5rp\") pod \"community-operators-4xzj6\" (UID: \"e5561ca1-6740-4603-ab70-aa59f825fef6\") " pod="openshift-marketplace/community-operators-4xzj6"
Dec 03 12:59:49 crc kubenswrapper[4711]: I1203 12:59:49.321810 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5561ca1-6740-4603-ab70-aa59f825fef6-utilities\") pod \"community-operators-4xzj6\" (UID: \"e5561ca1-6740-4603-ab70-aa59f825fef6\") " pod="openshift-marketplace/community-operators-4xzj6"
Dec 03 12:59:49 crc kubenswrapper[4711]: I1203 12:59:49.322182 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5561ca1-6740-4603-ab70-aa59f825fef6-catalog-content\") pod \"community-operators-4xzj6\" (UID: \"e5561ca1-6740-4603-ab70-aa59f825fef6\") " pod="openshift-marketplace/community-operators-4xzj6"
Dec 03 12:59:49 crc kubenswrapper[4711]: I1203 12:59:49.322206 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5561ca1-6740-4603-ab70-aa59f825fef6-utilities\") pod \"community-operators-4xzj6\" (UID: \"e5561ca1-6740-4603-ab70-aa59f825fef6\") " pod="openshift-marketplace/community-operators-4xzj6"
Dec 03 12:59:49 crc kubenswrapper[4711]: I1203 12:59:49.340578 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp5rp\" (UniqueName: \"kubernetes.io/projected/e5561ca1-6740-4603-ab70-aa59f825fef6-kube-api-access-dp5rp\") pod \"community-operators-4xzj6\" (UID: \"e5561ca1-6740-4603-ab70-aa59f825fef6\") " pod="openshift-marketplace/community-operators-4xzj6"
Dec 03 12:59:49 crc kubenswrapper[4711]: I1203 12:59:49.457135 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4xzj6"
Dec 03 12:59:49 crc kubenswrapper[4711]: I1203 12:59:49.732357 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4xzj6"]
Dec 03 12:59:49 crc kubenswrapper[4711]: I1203 12:59:49.955817 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-db8gg"
Dec 03 12:59:49 crc kubenswrapper[4711]: I1203 12:59:49.956211 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-db8gg"
Dec 03 12:59:49 crc kubenswrapper[4711]: I1203 12:59:49.994830 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4xzj6" event={"ID":"e5561ca1-6740-4603-ab70-aa59f825fef6","Type":"ContainerStarted","Data":"b75f8b5c497e502eec37e142eb21970ccfbb16057243439397317a4216af78c5"}
Dec 03 12:59:49 crc kubenswrapper[4711]: I1203 12:59:49.999418 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-db8gg"
Dec 03 12:59:51 crc kubenswrapper[4711]: I1203 12:59:51.069491 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-db8gg"
Dec 03 12:59:52 crc kubenswrapper[4711]: I1203 12:59:52.011386 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4xzj6" event={"ID":"e5561ca1-6740-4603-ab70-aa59f825fef6","Type":"ContainerStarted","Data":"94cfafc6aee12e33c76c615700ba2db0f0f05bec4cd5efbe5a159469ccc1f3d9"}
Dec 03 12:59:52 crc kubenswrapper[4711]: I1203 12:59:52.275426 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-db8gg"]
Dec 03 12:59:53 crc kubenswrapper[4711]: I1203 12:59:53.023755 4711 generic.go:334] "Generic (PLEG): container finished" podID="e5561ca1-6740-4603-ab70-aa59f825fef6" containerID="94cfafc6aee12e33c76c615700ba2db0f0f05bec4cd5efbe5a159469ccc1f3d9" exitCode=0
Dec 03 12:59:53 crc kubenswrapper[4711]: I1203 12:59:53.024688 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-db8gg" podUID="17794825-1aa3-4515-b4e5-8e7ada16bdbc" containerName="registry-server" containerID="cri-o://02b65ce95494593481bddb932233cc73f5bc8e629b43134efc57c14c46e2b0e7" gracePeriod=2
Dec 03 12:59:53 crc kubenswrapper[4711]: I1203 12:59:53.027646 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4xzj6" event={"ID":"e5561ca1-6740-4603-ab70-aa59f825fef6","Type":"ContainerDied","Data":"94cfafc6aee12e33c76c615700ba2db0f0f05bec4cd5efbe5a159469ccc1f3d9"}
Dec 03 12:59:53 crc kubenswrapper[4711]: I1203 12:59:53.390958 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-db8gg"
Dec 03 12:59:53 crc kubenswrapper[4711]: I1203 12:59:53.524201 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17794825-1aa3-4515-b4e5-8e7ada16bdbc-catalog-content\") pod \"17794825-1aa3-4515-b4e5-8e7ada16bdbc\" (UID: \"17794825-1aa3-4515-b4e5-8e7ada16bdbc\") "
Dec 03 12:59:53 crc kubenswrapper[4711]: I1203 12:59:53.524279 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frb2p\" (UniqueName: \"kubernetes.io/projected/17794825-1aa3-4515-b4e5-8e7ada16bdbc-kube-api-access-frb2p\") pod \"17794825-1aa3-4515-b4e5-8e7ada16bdbc\" (UID: \"17794825-1aa3-4515-b4e5-8e7ada16bdbc\") "
Dec 03 12:59:53 crc kubenswrapper[4711]: I1203 12:59:53.524442 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17794825-1aa3-4515-b4e5-8e7ada16bdbc-utilities\") pod \"17794825-1aa3-4515-b4e5-8e7ada16bdbc\" (UID: \"17794825-1aa3-4515-b4e5-8e7ada16bdbc\") "
Dec 03 12:59:53 crc kubenswrapper[4711]: I1203 12:59:53.525711 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17794825-1aa3-4515-b4e5-8e7ada16bdbc-utilities" (OuterVolumeSpecName: "utilities") pod "17794825-1aa3-4515-b4e5-8e7ada16bdbc" (UID: "17794825-1aa3-4515-b4e5-8e7ada16bdbc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:59:53 crc kubenswrapper[4711]: I1203 12:59:53.530468 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17794825-1aa3-4515-b4e5-8e7ada16bdbc-kube-api-access-frb2p" (OuterVolumeSpecName: "kube-api-access-frb2p") pod "17794825-1aa3-4515-b4e5-8e7ada16bdbc" (UID: "17794825-1aa3-4515-b4e5-8e7ada16bdbc"). InnerVolumeSpecName "kube-api-access-frb2p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:59:53 crc kubenswrapper[4711]: I1203 12:59:53.542039 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17794825-1aa3-4515-b4e5-8e7ada16bdbc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17794825-1aa3-4515-b4e5-8e7ada16bdbc" (UID: "17794825-1aa3-4515-b4e5-8e7ada16bdbc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:59:53 crc kubenswrapper[4711]: I1203 12:59:53.625701 4711 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17794825-1aa3-4515-b4e5-8e7ada16bdbc-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 12:59:53 crc kubenswrapper[4711]: I1203 12:59:53.625733 4711 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17794825-1aa3-4515-b4e5-8e7ada16bdbc-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 12:59:53 crc kubenswrapper[4711]: I1203 12:59:53.625743 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frb2p\" (UniqueName: \"kubernetes.io/projected/17794825-1aa3-4515-b4e5-8e7ada16bdbc-kube-api-access-frb2p\") on node \"crc\" DevicePath \"\""
Dec 03 12:59:54 crc kubenswrapper[4711]: I1203 12:59:54.035314 4711 generic.go:334] "Generic (PLEG): container finished" podID="17794825-1aa3-4515-b4e5-8e7ada16bdbc" containerID="02b65ce95494593481bddb932233cc73f5bc8e629b43134efc57c14c46e2b0e7" exitCode=0
Dec 03 12:59:54 crc kubenswrapper[4711]: I1203 12:59:54.035361 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-db8gg" event={"ID":"17794825-1aa3-4515-b4e5-8e7ada16bdbc","Type":"ContainerDied","Data":"02b65ce95494593481bddb932233cc73f5bc8e629b43134efc57c14c46e2b0e7"}
Dec 03 12:59:54 crc kubenswrapper[4711]: I1203 12:59:54.035399 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-db8gg" event={"ID":"17794825-1aa3-4515-b4e5-8e7ada16bdbc","Type":"ContainerDied","Data":"eef87cf1afcc5f2b2ec9c57f8e7f8e2f56c1c992c63a49c766a38bf5f92eb096"}
Dec 03 12:59:54 crc kubenswrapper[4711]: I1203 12:59:54.035424 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-db8gg"
Dec 03 12:59:54 crc kubenswrapper[4711]: I1203 12:59:54.035435 4711 scope.go:117] "RemoveContainer" containerID="02b65ce95494593481bddb932233cc73f5bc8e629b43134efc57c14c46e2b0e7"
Dec 03 12:59:54 crc kubenswrapper[4711]: I1203 12:59:54.063819 4711 scope.go:117] "RemoveContainer" containerID="581f3d886db7dd8b30d266115366076653cc083e29e1021fff2de4780d776a27"
Dec 03 12:59:54 crc kubenswrapper[4711]: I1203 12:59:54.065631 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-db8gg"]
Dec 03 12:59:54 crc kubenswrapper[4711]: I1203 12:59:54.072323 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-db8gg"]
Dec 03 12:59:54 crc kubenswrapper[4711]: I1203 12:59:54.103416 4711 scope.go:117] "RemoveContainer" containerID="de5f681a6b41a2046643ec38219f974f6c080730a8613266ff28ab09ce0f7a7a"
Dec 03 12:59:54 crc kubenswrapper[4711]: I1203 12:59:54.125652 4711 scope.go:117] "RemoveContainer" containerID="02b65ce95494593481bddb932233cc73f5bc8e629b43134efc57c14c46e2b0e7"
Dec 03 12:59:54 crc kubenswrapper[4711]: E1203 12:59:54.126147 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02b65ce95494593481bddb932233cc73f5bc8e629b43134efc57c14c46e2b0e7\": container with ID starting with 02b65ce95494593481bddb932233cc73f5bc8e629b43134efc57c14c46e2b0e7 not found: ID does not exist" containerID="02b65ce95494593481bddb932233cc73f5bc8e629b43134efc57c14c46e2b0e7"
Dec 03 12:59:54 crc kubenswrapper[4711]: I1203 12:59:54.126199 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02b65ce95494593481bddb932233cc73f5bc8e629b43134efc57c14c46e2b0e7"} err="failed to get container status \"02b65ce95494593481bddb932233cc73f5bc8e629b43134efc57c14c46e2b0e7\": rpc error: code = NotFound desc = could not find container \"02b65ce95494593481bddb932233cc73f5bc8e629b43134efc57c14c46e2b0e7\": container with ID starting with 02b65ce95494593481bddb932233cc73f5bc8e629b43134efc57c14c46e2b0e7 not found: ID does not exist"
Dec 03 12:59:54 crc kubenswrapper[4711]: I1203 12:59:54.126230 4711 scope.go:117] "RemoveContainer" containerID="581f3d886db7dd8b30d266115366076653cc083e29e1021fff2de4780d776a27"
Dec 03 12:59:54 crc kubenswrapper[4711]: E1203 12:59:54.128173 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"581f3d886db7dd8b30d266115366076653cc083e29e1021fff2de4780d776a27\": container with ID starting with 581f3d886db7dd8b30d266115366076653cc083e29e1021fff2de4780d776a27 not found: ID does not exist" containerID="581f3d886db7dd8b30d266115366076653cc083e29e1021fff2de4780d776a27"
Dec 03 12:59:54 crc kubenswrapper[4711]: I1203 12:59:54.128224 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"581f3d886db7dd8b30d266115366076653cc083e29e1021fff2de4780d776a27"} err="failed to get container status \"581f3d886db7dd8b30d266115366076653cc083e29e1021fff2de4780d776a27\": rpc error: code = NotFound desc = could not find container \"581f3d886db7dd8b30d266115366076653cc083e29e1021fff2de4780d776a27\": container with ID starting with 581f3d886db7dd8b30d266115366076653cc083e29e1021fff2de4780d776a27 not found: ID does not exist"
Dec 03 12:59:54 crc kubenswrapper[4711]: I1203 12:59:54.128257 4711 scope.go:117] "RemoveContainer" containerID="de5f681a6b41a2046643ec38219f974f6c080730a8613266ff28ab09ce0f7a7a"
Dec 03 12:59:54 crc kubenswrapper[4711]: E1203 12:59:54.128667 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de5f681a6b41a2046643ec38219f974f6c080730a8613266ff28ab09ce0f7a7a\": container with ID starting with de5f681a6b41a2046643ec38219f974f6c080730a8613266ff28ab09ce0f7a7a not found: ID does not exist" containerID="de5f681a6b41a2046643ec38219f974f6c080730a8613266ff28ab09ce0f7a7a"
Dec 03 12:59:54 crc kubenswrapper[4711]: I1203 12:59:54.128699 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de5f681a6b41a2046643ec38219f974f6c080730a8613266ff28ab09ce0f7a7a"} err="failed to get container status \"de5f681a6b41a2046643ec38219f974f6c080730a8613266ff28ab09ce0f7a7a\": rpc error: code = NotFound desc = could not find container \"de5f681a6b41a2046643ec38219f974f6c080730a8613266ff28ab09ce0f7a7a\": container with ID starting with de5f681a6b41a2046643ec38219f974f6c080730a8613266ff28ab09ce0f7a7a not found: ID does not exist"
Dec 03 12:59:55 crc kubenswrapper[4711]: I1203 12:59:55.048984 4711 generic.go:334] "Generic (PLEG): container finished" podID="e5561ca1-6740-4603-ab70-aa59f825fef6" containerID="49abd4a2207425b25d3f58d8b139871e22d61c3e2d372714d445d2ce31620607" exitCode=0
Dec 03 12:59:55 crc kubenswrapper[4711]: I1203 12:59:55.049308 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4xzj6" event={"ID":"e5561ca1-6740-4603-ab70-aa59f825fef6","Type":"ContainerDied","Data":"49abd4a2207425b25d3f58d8b139871e22d61c3e2d372714d445d2ce31620607"}
Dec 03 12:59:55 crc kubenswrapper[4711]: I1203 12:59:55.829500 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17794825-1aa3-4515-b4e5-8e7ada16bdbc" path="/var/lib/kubelet/pods/17794825-1aa3-4515-b4e5-8e7ada16bdbc/volumes"
Dec 03 12:59:56 crc kubenswrapper[4711]: I1203 12:59:56.685838 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7m675"]
Dec 03 12:59:56 crc kubenswrapper[4711]: E1203 12:59:56.686201 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17794825-1aa3-4515-b4e5-8e7ada16bdbc" containerName="extract-content"
Dec 03 12:59:56 crc kubenswrapper[4711]: I1203 12:59:56.686235 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="17794825-1aa3-4515-b4e5-8e7ada16bdbc" containerName="extract-content"
Dec 03 12:59:56 crc kubenswrapper[4711]: E1203 12:59:56.686251 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17794825-1aa3-4515-b4e5-8e7ada16bdbc" containerName="registry-server"
Dec 03 12:59:56 crc kubenswrapper[4711]: I1203 12:59:56.686259 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="17794825-1aa3-4515-b4e5-8e7ada16bdbc" containerName="registry-server"
Dec 03 12:59:56 crc kubenswrapper[4711]: E1203 12:59:56.686275 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17794825-1aa3-4515-b4e5-8e7ada16bdbc" containerName="extract-utilities"
Dec 03 12:59:56 crc kubenswrapper[4711]: I1203 12:59:56.686282 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="17794825-1aa3-4515-b4e5-8e7ada16bdbc" containerName="extract-utilities"
Dec 03 12:59:56 crc kubenswrapper[4711]: I1203 12:59:56.686435 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="17794825-1aa3-4515-b4e5-8e7ada16bdbc" containerName="registry-server"
Dec 03 12:59:56 crc kubenswrapper[4711]: I1203 12:59:56.687416 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7m675"
Dec 03 12:59:56 crc kubenswrapper[4711]: I1203 12:59:56.693219 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7m675"]
Dec 03 12:59:56 crc kubenswrapper[4711]: I1203 12:59:56.774268 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ab52750-c6d7-4c03-bc91-828ab04ba678-utilities\") pod \"certified-operators-7m675\" (UID: \"6ab52750-c6d7-4c03-bc91-828ab04ba678\") " pod="openshift-marketplace/certified-operators-7m675"
Dec 03 12:59:56 crc kubenswrapper[4711]: I1203 12:59:56.774438 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sk6r\" (UniqueName: \"kubernetes.io/projected/6ab52750-c6d7-4c03-bc91-828ab04ba678-kube-api-access-9sk6r\") pod \"certified-operators-7m675\" (UID: \"6ab52750-c6d7-4c03-bc91-828ab04ba678\") " pod="openshift-marketplace/certified-operators-7m675"
Dec 03 12:59:56 crc kubenswrapper[4711]: I1203 12:59:56.774480 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ab52750-c6d7-4c03-bc91-828ab04ba678-catalog-content\") pod \"certified-operators-7m675\" (UID: \"6ab52750-c6d7-4c03-bc91-828ab04ba678\") " pod="openshift-marketplace/certified-operators-7m675"
Dec 03 12:59:56 crc kubenswrapper[4711]: I1203 12:59:56.875675 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sk6r\" (UniqueName: \"kubernetes.io/projected/6ab52750-c6d7-4c03-bc91-828ab04ba678-kube-api-access-9sk6r\") pod \"certified-operators-7m675\" (UID: \"6ab52750-c6d7-4c03-bc91-828ab04ba678\") " pod="openshift-marketplace/certified-operators-7m675"
Dec 03 12:59:56 crc kubenswrapper[4711]: I1203 12:59:56.875714 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ab52750-c6d7-4c03-bc91-828ab04ba678-catalog-content\") pod \"certified-operators-7m675\" (UID: \"6ab52750-c6d7-4c03-bc91-828ab04ba678\") " pod="openshift-marketplace/certified-operators-7m675"
Dec 03 12:59:56 crc kubenswrapper[4711]: I1203 12:59:56.875774 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ab52750-c6d7-4c03-bc91-828ab04ba678-utilities\") pod \"certified-operators-7m675\" (UID: \"6ab52750-c6d7-4c03-bc91-828ab04ba678\") " pod="openshift-marketplace/certified-operators-7m675"
Dec 03 12:59:56 crc kubenswrapper[4711]: I1203 12:59:56.876211 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ab52750-c6d7-4c03-bc91-828ab04ba678-utilities\") pod \"certified-operators-7m675\" (UID: \"6ab52750-c6d7-4c03-bc91-828ab04ba678\") " pod="openshift-marketplace/certified-operators-7m675"
Dec 03 12:59:56 crc kubenswrapper[4711]: I1203 12:59:56.876394 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ab52750-c6d7-4c03-bc91-828ab04ba678-catalog-content\") pod \"certified-operators-7m675\" (UID: \"6ab52750-c6d7-4c03-bc91-828ab04ba678\") " pod="openshift-marketplace/certified-operators-7m675"
Dec 03 12:59:56 crc kubenswrapper[4711]: I1203 12:59:56.900117 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sk6r\" (UniqueName: \"kubernetes.io/projected/6ab52750-c6d7-4c03-bc91-828ab04ba678-kube-api-access-9sk6r\") pod \"certified-operators-7m675\" (UID: \"6ab52750-c6d7-4c03-bc91-828ab04ba678\") " pod="openshift-marketplace/certified-operators-7m675"
Dec 03 12:59:57 crc kubenswrapper[4711]: I1203 12:59:57.004388 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7m675"
Dec 03 12:59:57 crc kubenswrapper[4711]: I1203 12:59:57.079685 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4xzj6" event={"ID":"e5561ca1-6740-4603-ab70-aa59f825fef6","Type":"ContainerStarted","Data":"047771bebcab29b9caee3aa2110cc0688fc392a892a965aa88b70a1773a8d972"}
Dec 03 12:59:57 crc kubenswrapper[4711]: I1203 12:59:57.120887 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4xzj6" podStartSLOduration=4.828925928 podStartE2EDuration="8.120867143s" podCreationTimestamp="2025-12-03 12:59:49 +0000 UTC" firstStartedPulling="2025-12-03 12:59:53.031044462 +0000 UTC m=+2711.700295717" lastFinishedPulling="2025-12-03 12:59:56.322985677 +0000 UTC m=+2714.992236932" observedRunningTime="2025-12-03 12:59:57.105284894 +0000 UTC m=+2715.774536179" watchObservedRunningTime="2025-12-03 12:59:57.120867143 +0000 UTC m=+2715.790118398"
Dec 03 12:59:57 crc kubenswrapper[4711]: I1203 12:59:57.412446 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7m675"]
Dec 03 12:59:58 crc kubenswrapper[4711]: I1203 12:59:58.089158 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7m675" event={"ID":"6ab52750-c6d7-4c03-bc91-828ab04ba678","Type":"ContainerStarted","Data":"23c84dbe98bca2337b42a5b0935c29bf9ef67df7930ad1c0cffdd9f9eff3d82a"}
Dec 03 12:59:59 crc kubenswrapper[4711]: I1203 12:59:59.099055 4711 generic.go:334] "Generic (PLEG): container finished" podID="6ab52750-c6d7-4c03-bc91-828ab04ba678" containerID="643ae2bc2629a8aec7776cba7cd95c73314e3a230ea9b6c3d1abb9f156a69656" exitCode=0
Dec 03 12:59:59 crc kubenswrapper[4711]: I1203 12:59:59.099147 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7m675" event={"ID":"6ab52750-c6d7-4c03-bc91-828ab04ba678","Type":"ContainerDied","Data":"643ae2bc2629a8aec7776cba7cd95c73314e3a230ea9b6c3d1abb9f156a69656"}
Dec 03 12:59:59 crc kubenswrapper[4711]: I1203 12:59:59.457593 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4xzj6"
Dec 03 12:59:59 crc kubenswrapper[4711]: I1203 12:59:59.457645 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4xzj6"
Dec 03 12:59:59 crc kubenswrapper[4711]: I1203 12:59:59.500510 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4xzj6"
Dec 03 13:00:00 crc kubenswrapper[4711]: I1203 13:00:00.151828 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412780-rtmhd"]
Dec 03 13:00:00 crc kubenswrapper[4711]: I1203 13:00:00.153012 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412780-rtmhd"
Dec 03 13:00:00 crc kubenswrapper[4711]: I1203 13:00:00.156223 4711 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 03 13:00:00 crc kubenswrapper[4711]: I1203 13:00:00.156388 4711 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 03 13:00:00 crc kubenswrapper[4711]: I1203 13:00:00.182012 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412780-rtmhd"]
Dec 03 13:00:00 crc kubenswrapper[4711]: I1203 13:00:00.330390 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/21274fc9-6952-4e33-9614-b2ecece2bd8a-secret-volume\") pod \"collect-profiles-29412780-rtmhd\" (UID: \"21274fc9-6952-4e33-9614-b2ecece2bd8a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412780-rtmhd"
Dec 03 13:00:00 crc kubenswrapper[4711]: I1203 13:00:00.330565 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/21274fc9-6952-4e33-9614-b2ecece2bd8a-config-volume\") pod \"collect-profiles-29412780-rtmhd\" (UID: \"21274fc9-6952-4e33-9614-b2ecece2bd8a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412780-rtmhd"
Dec 03 13:00:00 crc kubenswrapper[4711]: I1203 13:00:00.330669 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zv9d\" (UniqueName: \"kubernetes.io/projected/21274fc9-6952-4e33-9614-b2ecece2bd8a-kube-api-access-5zv9d\") pod \"collect-profiles-29412780-rtmhd\" (UID: \"21274fc9-6952-4e33-9614-b2ecece2bd8a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412780-rtmhd"
Dec 03 13:00:00 crc kubenswrapper[4711]: I1203 13:00:00.432466 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/21274fc9-6952-4e33-9614-b2ecece2bd8a-secret-volume\") pod \"collect-profiles-29412780-rtmhd\" (UID: \"21274fc9-6952-4e33-9614-b2ecece2bd8a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412780-rtmhd"
Dec 03 13:00:00 crc kubenswrapper[4711]: I1203 13:00:00.432559 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/21274fc9-6952-4e33-9614-b2ecece2bd8a-config-volume\") pod \"collect-profiles-29412780-rtmhd\" (UID: \"21274fc9-6952-4e33-9614-b2ecece2bd8a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412780-rtmhd"
Dec 03 13:00:00 crc kubenswrapper[4711]: I1203 13:00:00.432601 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zv9d\" (UniqueName: \"kubernetes.io/projected/21274fc9-6952-4e33-9614-b2ecece2bd8a-kube-api-access-5zv9d\") pod \"collect-profiles-29412780-rtmhd\" (UID: \"21274fc9-6952-4e33-9614-b2ecece2bd8a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412780-rtmhd"
Dec 03 13:00:00 crc kubenswrapper[4711]: I1203 13:00:00.433712 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/21274fc9-6952-4e33-9614-b2ecece2bd8a-config-volume\") pod \"collect-profiles-29412780-rtmhd\" (UID: \"21274fc9-6952-4e33-9614-b2ecece2bd8a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412780-rtmhd"
Dec 03 13:00:00 crc kubenswrapper[4711]: I1203 13:00:00.438468 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/21274fc9-6952-4e33-9614-b2ecece2bd8a-secret-volume\") pod \"collect-profiles-29412780-rtmhd\" (UID: \"21274fc9-6952-4e33-9614-b2ecece2bd8a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412780-rtmhd"
Dec 03 13:00:00 crc kubenswrapper[4711]: I1203 13:00:00.449178 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zv9d\" (UniqueName: \"kubernetes.io/projected/21274fc9-6952-4e33-9614-b2ecece2bd8a-kube-api-access-5zv9d\") pod \"collect-profiles-29412780-rtmhd\" (UID: \"21274fc9-6952-4e33-9614-b2ecece2bd8a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412780-rtmhd"
Dec 03 13:00:00 crc kubenswrapper[4711]: I1203 13:00:00.474022 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412780-rtmhd"
Dec 03 13:00:00 crc kubenswrapper[4711]: I1203 13:00:00.686683 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412780-rtmhd"]
Dec 03 13:00:00 crc kubenswrapper[4711]: E1203 13:00:00.987445 4711 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ab52750_c6d7_4c03_bc91_828ab04ba678.slice/crio-4fae6674341c56b0f18ec655d1f8273964cc50cacbb29a422fa0baf6696e57bf.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ab52750_c6d7_4c03_bc91_828ab04ba678.slice/crio-conmon-4fae6674341c56b0f18ec655d1f8273964cc50cacbb29a422fa0baf6696e57bf.scope\": RecentStats: unable to find data in memory cache]"
Dec 03 13:00:01 crc kubenswrapper[4711]: I1203 13:00:01.117380 4711 generic.go:334] "Generic (PLEG): container finished" podID="6ab52750-c6d7-4c03-bc91-828ab04ba678" containerID="4fae6674341c56b0f18ec655d1f8273964cc50cacbb29a422fa0baf6696e57bf" exitCode=0 Dec 03 
13:00:01 crc kubenswrapper[4711]: I1203 13:00:01.117425 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7m675" event={"ID":"6ab52750-c6d7-4c03-bc91-828ab04ba678","Type":"ContainerDied","Data":"4fae6674341c56b0f18ec655d1f8273964cc50cacbb29a422fa0baf6696e57bf"} Dec 03 13:00:01 crc kubenswrapper[4711]: I1203 13:00:01.119272 4711 generic.go:334] "Generic (PLEG): container finished" podID="21274fc9-6952-4e33-9614-b2ecece2bd8a" containerID="0a1b9543fdd8e54a9365e1c5124708e07a417ff9129fe407917391f27ea8fa11" exitCode=0 Dec 03 13:00:01 crc kubenswrapper[4711]: I1203 13:00:01.119304 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412780-rtmhd" event={"ID":"21274fc9-6952-4e33-9614-b2ecece2bd8a","Type":"ContainerDied","Data":"0a1b9543fdd8e54a9365e1c5124708e07a417ff9129fe407917391f27ea8fa11"} Dec 03 13:00:01 crc kubenswrapper[4711]: I1203 13:00:01.119323 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412780-rtmhd" event={"ID":"21274fc9-6952-4e33-9614-b2ecece2bd8a","Type":"ContainerStarted","Data":"9de8d8e7341b7c0d20691669305a7ede4d9e84989360643eecd0454303a51b92"} Dec 03 13:00:02 crc kubenswrapper[4711]: I1203 13:00:02.132738 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7m675" event={"ID":"6ab52750-c6d7-4c03-bc91-828ab04ba678","Type":"ContainerStarted","Data":"c5cbca05f7c61264ad0579fd3857012a1e080880744b9c2ffd667b216e6d9c22"} Dec 03 13:00:02 crc kubenswrapper[4711]: I1203 13:00:02.157788 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7m675" podStartSLOduration=3.584628619 podStartE2EDuration="6.157766294s" podCreationTimestamp="2025-12-03 12:59:56 +0000 UTC" firstStartedPulling="2025-12-03 12:59:59.101374861 +0000 UTC m=+2717.770626116" 
lastFinishedPulling="2025-12-03 13:00:01.674512536 +0000 UTC m=+2720.343763791" observedRunningTime="2025-12-03 13:00:02.15619663 +0000 UTC m=+2720.825447905" watchObservedRunningTime="2025-12-03 13:00:02.157766294 +0000 UTC m=+2720.827017559" Dec 03 13:00:02 crc kubenswrapper[4711]: I1203 13:00:02.462646 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412780-rtmhd" Dec 03 13:00:02 crc kubenswrapper[4711]: I1203 13:00:02.565279 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/21274fc9-6952-4e33-9614-b2ecece2bd8a-config-volume\") pod \"21274fc9-6952-4e33-9614-b2ecece2bd8a\" (UID: \"21274fc9-6952-4e33-9614-b2ecece2bd8a\") " Dec 03 13:00:02 crc kubenswrapper[4711]: I1203 13:00:02.566013 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21274fc9-6952-4e33-9614-b2ecece2bd8a-config-volume" (OuterVolumeSpecName: "config-volume") pod "21274fc9-6952-4e33-9614-b2ecece2bd8a" (UID: "21274fc9-6952-4e33-9614-b2ecece2bd8a"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 13:00:02 crc kubenswrapper[4711]: I1203 13:00:02.567660 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/21274fc9-6952-4e33-9614-b2ecece2bd8a-secret-volume\") pod \"21274fc9-6952-4e33-9614-b2ecece2bd8a\" (UID: \"21274fc9-6952-4e33-9614-b2ecece2bd8a\") " Dec 03 13:00:02 crc kubenswrapper[4711]: I1203 13:00:02.567737 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zv9d\" (UniqueName: \"kubernetes.io/projected/21274fc9-6952-4e33-9614-b2ecece2bd8a-kube-api-access-5zv9d\") pod \"21274fc9-6952-4e33-9614-b2ecece2bd8a\" (UID: \"21274fc9-6952-4e33-9614-b2ecece2bd8a\") " Dec 03 13:00:02 crc kubenswrapper[4711]: I1203 13:00:02.568264 4711 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/21274fc9-6952-4e33-9614-b2ecece2bd8a-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 13:00:02 crc kubenswrapper[4711]: I1203 13:00:02.573800 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21274fc9-6952-4e33-9614-b2ecece2bd8a-kube-api-access-5zv9d" (OuterVolumeSpecName: "kube-api-access-5zv9d") pod "21274fc9-6952-4e33-9614-b2ecece2bd8a" (UID: "21274fc9-6952-4e33-9614-b2ecece2bd8a"). InnerVolumeSpecName "kube-api-access-5zv9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:00:02 crc kubenswrapper[4711]: I1203 13:00:02.573963 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21274fc9-6952-4e33-9614-b2ecece2bd8a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "21274fc9-6952-4e33-9614-b2ecece2bd8a" (UID: "21274fc9-6952-4e33-9614-b2ecece2bd8a"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:00:02 crc kubenswrapper[4711]: I1203 13:00:02.670282 4711 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/21274fc9-6952-4e33-9614-b2ecece2bd8a-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 13:00:02 crc kubenswrapper[4711]: I1203 13:00:02.670319 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zv9d\" (UniqueName: \"kubernetes.io/projected/21274fc9-6952-4e33-9614-b2ecece2bd8a-kube-api-access-5zv9d\") on node \"crc\" DevicePath \"\"" Dec 03 13:00:03 crc kubenswrapper[4711]: I1203 13:00:03.141877 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412780-rtmhd" Dec 03 13:00:03 crc kubenswrapper[4711]: I1203 13:00:03.141883 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412780-rtmhd" event={"ID":"21274fc9-6952-4e33-9614-b2ecece2bd8a","Type":"ContainerDied","Data":"9de8d8e7341b7c0d20691669305a7ede4d9e84989360643eecd0454303a51b92"} Dec 03 13:00:03 crc kubenswrapper[4711]: I1203 13:00:03.142276 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9de8d8e7341b7c0d20691669305a7ede4d9e84989360643eecd0454303a51b92" Dec 03 13:00:03 crc kubenswrapper[4711]: I1203 13:00:03.545883 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412735-shrp8"] Dec 03 13:00:03 crc kubenswrapper[4711]: I1203 13:00:03.553085 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412735-shrp8"] Dec 03 13:00:03 crc kubenswrapper[4711]: I1203 13:00:03.827442 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95" 
path="/var/lib/kubelet/pods/a29202bb-2b0f-4a1b-b1e1-efd1eadbeb95/volumes" Dec 03 13:00:05 crc kubenswrapper[4711]: I1203 13:00:05.401229 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 13:00:05 crc kubenswrapper[4711]: I1203 13:00:05.401732 4711 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 13:00:07 crc kubenswrapper[4711]: I1203 13:00:07.004774 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7m675" Dec 03 13:00:07 crc kubenswrapper[4711]: I1203 13:00:07.005021 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7m675" Dec 03 13:00:07 crc kubenswrapper[4711]: I1203 13:00:07.046396 4711 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7m675" Dec 03 13:00:07 crc kubenswrapper[4711]: I1203 13:00:07.227180 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7m675" Dec 03 13:00:07 crc kubenswrapper[4711]: I1203 13:00:07.318004 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7m675"] Dec 03 13:00:09 crc kubenswrapper[4711]: I1203 13:00:09.194378 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7m675" podUID="6ab52750-c6d7-4c03-bc91-828ab04ba678" 
containerName="registry-server" containerID="cri-o://c5cbca05f7c61264ad0579fd3857012a1e080880744b9c2ffd667b216e6d9c22" gracePeriod=2 Dec 03 13:00:09 crc kubenswrapper[4711]: I1203 13:00:09.557559 4711 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4xzj6" Dec 03 13:00:09 crc kubenswrapper[4711]: I1203 13:00:09.678973 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4xzj6"] Dec 03 13:00:10 crc kubenswrapper[4711]: I1203 13:00:10.203352 4711 generic.go:334] "Generic (PLEG): container finished" podID="6ab52750-c6d7-4c03-bc91-828ab04ba678" containerID="c5cbca05f7c61264ad0579fd3857012a1e080880744b9c2ffd667b216e6d9c22" exitCode=0 Dec 03 13:00:10 crc kubenswrapper[4711]: I1203 13:00:10.203447 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7m675" event={"ID":"6ab52750-c6d7-4c03-bc91-828ab04ba678","Type":"ContainerDied","Data":"c5cbca05f7c61264ad0579fd3857012a1e080880744b9c2ffd667b216e6d9c22"} Dec 03 13:00:10 crc kubenswrapper[4711]: I1203 13:00:10.204538 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4xzj6" podUID="e5561ca1-6740-4603-ab70-aa59f825fef6" containerName="registry-server" containerID="cri-o://047771bebcab29b9caee3aa2110cc0688fc392a892a965aa88b70a1773a8d972" gracePeriod=2 Dec 03 13:00:10 crc kubenswrapper[4711]: I1203 13:00:10.617603 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4xzj6" Dec 03 13:00:10 crc kubenswrapper[4711]: I1203 13:00:10.690005 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dp5rp\" (UniqueName: \"kubernetes.io/projected/e5561ca1-6740-4603-ab70-aa59f825fef6-kube-api-access-dp5rp\") pod \"e5561ca1-6740-4603-ab70-aa59f825fef6\" (UID: \"e5561ca1-6740-4603-ab70-aa59f825fef6\") " Dec 03 13:00:10 crc kubenswrapper[4711]: I1203 13:00:10.690101 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5561ca1-6740-4603-ab70-aa59f825fef6-utilities\") pod \"e5561ca1-6740-4603-ab70-aa59f825fef6\" (UID: \"e5561ca1-6740-4603-ab70-aa59f825fef6\") " Dec 03 13:00:10 crc kubenswrapper[4711]: I1203 13:00:10.690217 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5561ca1-6740-4603-ab70-aa59f825fef6-catalog-content\") pod \"e5561ca1-6740-4603-ab70-aa59f825fef6\" (UID: \"e5561ca1-6740-4603-ab70-aa59f825fef6\") " Dec 03 13:00:10 crc kubenswrapper[4711]: I1203 13:00:10.691720 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5561ca1-6740-4603-ab70-aa59f825fef6-utilities" (OuterVolumeSpecName: "utilities") pod "e5561ca1-6740-4603-ab70-aa59f825fef6" (UID: "e5561ca1-6740-4603-ab70-aa59f825fef6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:00:10 crc kubenswrapper[4711]: I1203 13:00:10.696088 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5561ca1-6740-4603-ab70-aa59f825fef6-kube-api-access-dp5rp" (OuterVolumeSpecName: "kube-api-access-dp5rp") pod "e5561ca1-6740-4603-ab70-aa59f825fef6" (UID: "e5561ca1-6740-4603-ab70-aa59f825fef6"). InnerVolumeSpecName "kube-api-access-dp5rp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:00:10 crc kubenswrapper[4711]: I1203 13:00:10.742665 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7m675" Dec 03 13:00:10 crc kubenswrapper[4711]: I1203 13:00:10.748851 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5561ca1-6740-4603-ab70-aa59f825fef6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5561ca1-6740-4603-ab70-aa59f825fef6" (UID: "e5561ca1-6740-4603-ab70-aa59f825fef6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:00:10 crc kubenswrapper[4711]: I1203 13:00:10.791624 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sk6r\" (UniqueName: \"kubernetes.io/projected/6ab52750-c6d7-4c03-bc91-828ab04ba678-kube-api-access-9sk6r\") pod \"6ab52750-c6d7-4c03-bc91-828ab04ba678\" (UID: \"6ab52750-c6d7-4c03-bc91-828ab04ba678\") " Dec 03 13:00:10 crc kubenswrapper[4711]: I1203 13:00:10.791805 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ab52750-c6d7-4c03-bc91-828ab04ba678-utilities\") pod \"6ab52750-c6d7-4c03-bc91-828ab04ba678\" (UID: \"6ab52750-c6d7-4c03-bc91-828ab04ba678\") " Dec 03 13:00:10 crc kubenswrapper[4711]: I1203 13:00:10.791837 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ab52750-c6d7-4c03-bc91-828ab04ba678-catalog-content\") pod \"6ab52750-c6d7-4c03-bc91-828ab04ba678\" (UID: \"6ab52750-c6d7-4c03-bc91-828ab04ba678\") " Dec 03 13:00:10 crc kubenswrapper[4711]: I1203 13:00:10.792145 4711 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e5561ca1-6740-4603-ab70-aa59f825fef6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 13:00:10 crc kubenswrapper[4711]: I1203 13:00:10.792163 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dp5rp\" (UniqueName: \"kubernetes.io/projected/e5561ca1-6740-4603-ab70-aa59f825fef6-kube-api-access-dp5rp\") on node \"crc\" DevicePath \"\"" Dec 03 13:00:10 crc kubenswrapper[4711]: I1203 13:00:10.792176 4711 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5561ca1-6740-4603-ab70-aa59f825fef6-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 13:00:10 crc kubenswrapper[4711]: I1203 13:00:10.792849 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ab52750-c6d7-4c03-bc91-828ab04ba678-utilities" (OuterVolumeSpecName: "utilities") pod "6ab52750-c6d7-4c03-bc91-828ab04ba678" (UID: "6ab52750-c6d7-4c03-bc91-828ab04ba678"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:00:10 crc kubenswrapper[4711]: I1203 13:00:10.799381 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ab52750-c6d7-4c03-bc91-828ab04ba678-kube-api-access-9sk6r" (OuterVolumeSpecName: "kube-api-access-9sk6r") pod "6ab52750-c6d7-4c03-bc91-828ab04ba678" (UID: "6ab52750-c6d7-4c03-bc91-828ab04ba678"). InnerVolumeSpecName "kube-api-access-9sk6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:00:10 crc kubenswrapper[4711]: I1203 13:00:10.847667 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ab52750-c6d7-4c03-bc91-828ab04ba678-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ab52750-c6d7-4c03-bc91-828ab04ba678" (UID: "6ab52750-c6d7-4c03-bc91-828ab04ba678"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:00:10 crc kubenswrapper[4711]: I1203 13:00:10.893889 4711 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ab52750-c6d7-4c03-bc91-828ab04ba678-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 13:00:10 crc kubenswrapper[4711]: I1203 13:00:10.893989 4711 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ab52750-c6d7-4c03-bc91-828ab04ba678-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 13:00:10 crc kubenswrapper[4711]: I1203 13:00:10.894002 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sk6r\" (UniqueName: \"kubernetes.io/projected/6ab52750-c6d7-4c03-bc91-828ab04ba678-kube-api-access-9sk6r\") on node \"crc\" DevicePath \"\"" Dec 03 13:00:11 crc kubenswrapper[4711]: I1203 13:00:11.215182 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7m675" Dec 03 13:00:11 crc kubenswrapper[4711]: I1203 13:00:11.215188 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7m675" event={"ID":"6ab52750-c6d7-4c03-bc91-828ab04ba678","Type":"ContainerDied","Data":"23c84dbe98bca2337b42a5b0935c29bf9ef67df7930ad1c0cffdd9f9eff3d82a"} Dec 03 13:00:11 crc kubenswrapper[4711]: I1203 13:00:11.215623 4711 scope.go:117] "RemoveContainer" containerID="c5cbca05f7c61264ad0579fd3857012a1e080880744b9c2ffd667b216e6d9c22" Dec 03 13:00:11 crc kubenswrapper[4711]: I1203 13:00:11.219056 4711 generic.go:334] "Generic (PLEG): container finished" podID="e5561ca1-6740-4603-ab70-aa59f825fef6" containerID="047771bebcab29b9caee3aa2110cc0688fc392a892a965aa88b70a1773a8d972" exitCode=0 Dec 03 13:00:11 crc kubenswrapper[4711]: I1203 13:00:11.219214 4711 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4xzj6" Dec 03 13:00:11 crc kubenswrapper[4711]: I1203 13:00:11.219385 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4xzj6" event={"ID":"e5561ca1-6740-4603-ab70-aa59f825fef6","Type":"ContainerDied","Data":"047771bebcab29b9caee3aa2110cc0688fc392a892a965aa88b70a1773a8d972"} Dec 03 13:00:11 crc kubenswrapper[4711]: I1203 13:00:11.219418 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4xzj6" event={"ID":"e5561ca1-6740-4603-ab70-aa59f825fef6","Type":"ContainerDied","Data":"b75f8b5c497e502eec37e142eb21970ccfbb16057243439397317a4216af78c5"} Dec 03 13:00:11 crc kubenswrapper[4711]: I1203 13:00:11.234993 4711 scope.go:117] "RemoveContainer" containerID="4fae6674341c56b0f18ec655d1f8273964cc50cacbb29a422fa0baf6696e57bf" Dec 03 13:00:11 crc kubenswrapper[4711]: I1203 13:00:11.254255 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7m675"] Dec 03 13:00:11 crc kubenswrapper[4711]: I1203 13:00:11.259526 4711 scope.go:117] "RemoveContainer" containerID="643ae2bc2629a8aec7776cba7cd95c73314e3a230ea9b6c3d1abb9f156a69656" Dec 03 13:00:11 crc kubenswrapper[4711]: I1203 13:00:11.266218 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7m675"] Dec 03 13:00:11 crc kubenswrapper[4711]: I1203 13:00:11.274736 4711 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4xzj6"] Dec 03 13:00:11 crc kubenswrapper[4711]: I1203 13:00:11.277223 4711 scope.go:117] "RemoveContainer" containerID="047771bebcab29b9caee3aa2110cc0688fc392a892a965aa88b70a1773a8d972" Dec 03 13:00:11 crc kubenswrapper[4711]: I1203 13:00:11.283988 4711 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4xzj6"] Dec 03 13:00:11 crc kubenswrapper[4711]: I1203 
13:00:11.291856 4711 scope.go:117] "RemoveContainer" containerID="49abd4a2207425b25d3f58d8b139871e22d61c3e2d372714d445d2ce31620607" Dec 03 13:00:11 crc kubenswrapper[4711]: I1203 13:00:11.312630 4711 scope.go:117] "RemoveContainer" containerID="94cfafc6aee12e33c76c615700ba2db0f0f05bec4cd5efbe5a159469ccc1f3d9" Dec 03 13:00:11 crc kubenswrapper[4711]: I1203 13:00:11.359604 4711 scope.go:117] "RemoveContainer" containerID="047771bebcab29b9caee3aa2110cc0688fc392a892a965aa88b70a1773a8d972" Dec 03 13:00:11 crc kubenswrapper[4711]: E1203 13:00:11.360163 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"047771bebcab29b9caee3aa2110cc0688fc392a892a965aa88b70a1773a8d972\": container with ID starting with 047771bebcab29b9caee3aa2110cc0688fc392a892a965aa88b70a1773a8d972 not found: ID does not exist" containerID="047771bebcab29b9caee3aa2110cc0688fc392a892a965aa88b70a1773a8d972" Dec 03 13:00:11 crc kubenswrapper[4711]: I1203 13:00:11.360214 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"047771bebcab29b9caee3aa2110cc0688fc392a892a965aa88b70a1773a8d972"} err="failed to get container status \"047771bebcab29b9caee3aa2110cc0688fc392a892a965aa88b70a1773a8d972\": rpc error: code = NotFound desc = could not find container \"047771bebcab29b9caee3aa2110cc0688fc392a892a965aa88b70a1773a8d972\": container with ID starting with 047771bebcab29b9caee3aa2110cc0688fc392a892a965aa88b70a1773a8d972 not found: ID does not exist" Dec 03 13:00:11 crc kubenswrapper[4711]: I1203 13:00:11.360247 4711 scope.go:117] "RemoveContainer" containerID="49abd4a2207425b25d3f58d8b139871e22d61c3e2d372714d445d2ce31620607" Dec 03 13:00:11 crc kubenswrapper[4711]: E1203 13:00:11.360621 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49abd4a2207425b25d3f58d8b139871e22d61c3e2d372714d445d2ce31620607\": container 
with ID starting with 49abd4a2207425b25d3f58d8b139871e22d61c3e2d372714d445d2ce31620607 not found: ID does not exist" containerID="49abd4a2207425b25d3f58d8b139871e22d61c3e2d372714d445d2ce31620607" Dec 03 13:00:11 crc kubenswrapper[4711]: I1203 13:00:11.360653 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49abd4a2207425b25d3f58d8b139871e22d61c3e2d372714d445d2ce31620607"} err="failed to get container status \"49abd4a2207425b25d3f58d8b139871e22d61c3e2d372714d445d2ce31620607\": rpc error: code = NotFound desc = could not find container \"49abd4a2207425b25d3f58d8b139871e22d61c3e2d372714d445d2ce31620607\": container with ID starting with 49abd4a2207425b25d3f58d8b139871e22d61c3e2d372714d445d2ce31620607 not found: ID does not exist" Dec 03 13:00:11 crc kubenswrapper[4711]: I1203 13:00:11.360676 4711 scope.go:117] "RemoveContainer" containerID="94cfafc6aee12e33c76c615700ba2db0f0f05bec4cd5efbe5a159469ccc1f3d9" Dec 03 13:00:11 crc kubenswrapper[4711]: E1203 13:00:11.361151 4711 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94cfafc6aee12e33c76c615700ba2db0f0f05bec4cd5efbe5a159469ccc1f3d9\": container with ID starting with 94cfafc6aee12e33c76c615700ba2db0f0f05bec4cd5efbe5a159469ccc1f3d9 not found: ID does not exist" containerID="94cfafc6aee12e33c76c615700ba2db0f0f05bec4cd5efbe5a159469ccc1f3d9" Dec 03 13:00:11 crc kubenswrapper[4711]: I1203 13:00:11.361176 4711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94cfafc6aee12e33c76c615700ba2db0f0f05bec4cd5efbe5a159469ccc1f3d9"} err="failed to get container status \"94cfafc6aee12e33c76c615700ba2db0f0f05bec4cd5efbe5a159469ccc1f3d9\": rpc error: code = NotFound desc = could not find container \"94cfafc6aee12e33c76c615700ba2db0f0f05bec4cd5efbe5a159469ccc1f3d9\": container with ID starting with 94cfafc6aee12e33c76c615700ba2db0f0f05bec4cd5efbe5a159469ccc1f3d9 not 
found: ID does not exist"
Dec 03 13:00:11 crc kubenswrapper[4711]: I1203 13:00:11.824030 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ab52750-c6d7-4c03-bc91-828ab04ba678" path="/var/lib/kubelet/pods/6ab52750-c6d7-4c03-bc91-828ab04ba678/volumes"
Dec 03 13:00:11 crc kubenswrapper[4711]: I1203 13:00:11.824705 4711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5561ca1-6740-4603-ab70-aa59f825fef6" path="/var/lib/kubelet/pods/e5561ca1-6740-4603-ab70-aa59f825fef6/volumes"
Dec 03 13:00:35 crc kubenswrapper[4711]: I1203 13:00:35.401316 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 13:00:35 crc kubenswrapper[4711]: I1203 13:00:35.401903 4711 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 13:00:35 crc kubenswrapper[4711]: I1203 13:00:35.401965 4711 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-52jgg"
Dec 03 13:00:35 crc kubenswrapper[4711]: I1203 13:00:35.414984 4711 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"076734f6b97d3ea3a4f6b67e145b0fada6a5ce876a0fbb325602c953ddd0000f"} pod="openshift-machine-config-operator/machine-config-daemon-52jgg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 03 13:00:35 crc kubenswrapper[4711]: I1203 13:00:35.415055 4711 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" containerID="cri-o://076734f6b97d3ea3a4f6b67e145b0fada6a5ce876a0fbb325602c953ddd0000f" gracePeriod=600
Dec 03 13:00:36 crc kubenswrapper[4711]: I1203 13:00:36.424819 4711 generic.go:334] "Generic (PLEG): container finished" podID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerID="076734f6b97d3ea3a4f6b67e145b0fada6a5ce876a0fbb325602c953ddd0000f" exitCode=0
Dec 03 13:00:36 crc kubenswrapper[4711]: I1203 13:00:36.424878 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" event={"ID":"776e7d35-d59b-4d4a-97cd-aec4f2441c1e","Type":"ContainerDied","Data":"076734f6b97d3ea3a4f6b67e145b0fada6a5ce876a0fbb325602c953ddd0000f"}
Dec 03 13:00:36 crc kubenswrapper[4711]: I1203 13:00:36.425201 4711 scope.go:117] "RemoveContainer" containerID="34c4c8649e6b4de50f304749dbfcdcd32a9ef18c76884ef8de700370ecd0a6e3"
Dec 03 13:00:37 crc kubenswrapper[4711]: I1203 13:00:37.434810 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" event={"ID":"776e7d35-d59b-4d4a-97cd-aec4f2441c1e","Type":"ContainerStarted","Data":"cd70f8b828e404a9dde37b613cea3b41da15380612f5abc62e6a57bdd8ff2fd7"}
Dec 03 13:00:48 crc kubenswrapper[4711]: I1203 13:00:48.524923 4711 scope.go:117] "RemoveContainer" containerID="bfc6688eed32d5f0ac91702d200362720c60ef8c668a84d60496f0342e32e430"
Dec 03 13:01:00 crc kubenswrapper[4711]: I1203 13:01:00.141107 4711 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-cron-29412781-slgkq"]
Dec 03 13:01:00 crc kubenswrapper[4711]: E1203 13:01:00.142092 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ab52750-c6d7-4c03-bc91-828ab04ba678" containerName="registry-server"
Dec 03 13:01:00 crc kubenswrapper[4711]: I1203 13:01:00.142113 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ab52750-c6d7-4c03-bc91-828ab04ba678" containerName="registry-server"
Dec 03 13:01:00 crc kubenswrapper[4711]: E1203 13:01:00.142129 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ab52750-c6d7-4c03-bc91-828ab04ba678" containerName="extract-content"
Dec 03 13:01:00 crc kubenswrapper[4711]: I1203 13:01:00.142137 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ab52750-c6d7-4c03-bc91-828ab04ba678" containerName="extract-content"
Dec 03 13:01:00 crc kubenswrapper[4711]: E1203 13:01:00.142154 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5561ca1-6740-4603-ab70-aa59f825fef6" containerName="extract-utilities"
Dec 03 13:01:00 crc kubenswrapper[4711]: I1203 13:01:00.142163 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5561ca1-6740-4603-ab70-aa59f825fef6" containerName="extract-utilities"
Dec 03 13:01:00 crc kubenswrapper[4711]: E1203 13:01:00.142176 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5561ca1-6740-4603-ab70-aa59f825fef6" containerName="registry-server"
Dec 03 13:01:00 crc kubenswrapper[4711]: I1203 13:01:00.142185 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5561ca1-6740-4603-ab70-aa59f825fef6" containerName="registry-server"
Dec 03 13:01:00 crc kubenswrapper[4711]: E1203 13:01:00.142212 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21274fc9-6952-4e33-9614-b2ecece2bd8a" containerName="collect-profiles"
Dec 03 13:01:00 crc kubenswrapper[4711]: I1203 13:01:00.142221 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="21274fc9-6952-4e33-9614-b2ecece2bd8a" containerName="collect-profiles"
Dec 03 13:01:00 crc kubenswrapper[4711]: E1203 13:01:00.142234 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5561ca1-6740-4603-ab70-aa59f825fef6" containerName="extract-content"
Dec 03 13:01:00 crc kubenswrapper[4711]: I1203 13:01:00.142242 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5561ca1-6740-4603-ab70-aa59f825fef6" containerName="extract-content"
Dec 03 13:01:00 crc kubenswrapper[4711]: E1203 13:01:00.142252 4711 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ab52750-c6d7-4c03-bc91-828ab04ba678" containerName="extract-utilities"
Dec 03 13:01:00 crc kubenswrapper[4711]: I1203 13:01:00.142260 4711 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ab52750-c6d7-4c03-bc91-828ab04ba678" containerName="extract-utilities"
Dec 03 13:01:00 crc kubenswrapper[4711]: I1203 13:01:00.142470 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="21274fc9-6952-4e33-9614-b2ecece2bd8a" containerName="collect-profiles"
Dec 03 13:01:00 crc kubenswrapper[4711]: I1203 13:01:00.142486 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5561ca1-6740-4603-ab70-aa59f825fef6" containerName="registry-server"
Dec 03 13:01:00 crc kubenswrapper[4711]: I1203 13:01:00.142495 4711 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ab52750-c6d7-4c03-bc91-828ab04ba678" containerName="registry-server"
Dec 03 13:01:00 crc kubenswrapper[4711]: I1203 13:01:00.143123 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-cron-29412781-slgkq"
Dec 03 13:01:00 crc kubenswrapper[4711]: I1203 13:01:00.159772 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-cron-29412781-slgkq"]
Dec 03 13:01:00 crc kubenswrapper[4711]: I1203 13:01:00.234292 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c731c517-c15e-41a0-bcc5-54ad823c2a9c-config-data\") pod \"keystone-cron-29412781-slgkq\" (UID: \"c731c517-c15e-41a0-bcc5-54ad823c2a9c\") " pod="glance-kuttl-tests/keystone-cron-29412781-slgkq"
Dec 03 13:01:00 crc kubenswrapper[4711]: I1203 13:01:00.234350 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bxqd\" (UniqueName: \"kubernetes.io/projected/c731c517-c15e-41a0-bcc5-54ad823c2a9c-kube-api-access-9bxqd\") pod \"keystone-cron-29412781-slgkq\" (UID: \"c731c517-c15e-41a0-bcc5-54ad823c2a9c\") " pod="glance-kuttl-tests/keystone-cron-29412781-slgkq"
Dec 03 13:01:00 crc kubenswrapper[4711]: I1203 13:01:00.234472 4711 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c731c517-c15e-41a0-bcc5-54ad823c2a9c-fernet-keys\") pod \"keystone-cron-29412781-slgkq\" (UID: \"c731c517-c15e-41a0-bcc5-54ad823c2a9c\") " pod="glance-kuttl-tests/keystone-cron-29412781-slgkq"
Dec 03 13:01:00 crc kubenswrapper[4711]: I1203 13:01:00.335763 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c731c517-c15e-41a0-bcc5-54ad823c2a9c-config-data\") pod \"keystone-cron-29412781-slgkq\" (UID: \"c731c517-c15e-41a0-bcc5-54ad823c2a9c\") " pod="glance-kuttl-tests/keystone-cron-29412781-slgkq"
Dec 03 13:01:00 crc kubenswrapper[4711]: I1203 13:01:00.335824 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bxqd\" (UniqueName: \"kubernetes.io/projected/c731c517-c15e-41a0-bcc5-54ad823c2a9c-kube-api-access-9bxqd\") pod \"keystone-cron-29412781-slgkq\" (UID: \"c731c517-c15e-41a0-bcc5-54ad823c2a9c\") " pod="glance-kuttl-tests/keystone-cron-29412781-slgkq"
Dec 03 13:01:00 crc kubenswrapper[4711]: I1203 13:01:00.335942 4711 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c731c517-c15e-41a0-bcc5-54ad823c2a9c-fernet-keys\") pod \"keystone-cron-29412781-slgkq\" (UID: \"c731c517-c15e-41a0-bcc5-54ad823c2a9c\") " pod="glance-kuttl-tests/keystone-cron-29412781-slgkq"
Dec 03 13:01:00 crc kubenswrapper[4711]: I1203 13:01:00.342502 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c731c517-c15e-41a0-bcc5-54ad823c2a9c-config-data\") pod \"keystone-cron-29412781-slgkq\" (UID: \"c731c517-c15e-41a0-bcc5-54ad823c2a9c\") " pod="glance-kuttl-tests/keystone-cron-29412781-slgkq"
Dec 03 13:01:00 crc kubenswrapper[4711]: I1203 13:01:00.342562 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c731c517-c15e-41a0-bcc5-54ad823c2a9c-fernet-keys\") pod \"keystone-cron-29412781-slgkq\" (UID: \"c731c517-c15e-41a0-bcc5-54ad823c2a9c\") " pod="glance-kuttl-tests/keystone-cron-29412781-slgkq"
Dec 03 13:01:00 crc kubenswrapper[4711]: I1203 13:01:00.356637 4711 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bxqd\" (UniqueName: \"kubernetes.io/projected/c731c517-c15e-41a0-bcc5-54ad823c2a9c-kube-api-access-9bxqd\") pod \"keystone-cron-29412781-slgkq\" (UID: \"c731c517-c15e-41a0-bcc5-54ad823c2a9c\") " pod="glance-kuttl-tests/keystone-cron-29412781-slgkq"
Dec 03 13:01:00 crc kubenswrapper[4711]: I1203 13:01:00.459626 4711 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-cron-29412781-slgkq"
Dec 03 13:01:00 crc kubenswrapper[4711]: I1203 13:01:00.880012 4711 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-cron-29412781-slgkq"]
Dec 03 13:01:01 crc kubenswrapper[4711]: I1203 13:01:01.610489 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-cron-29412781-slgkq" event={"ID":"c731c517-c15e-41a0-bcc5-54ad823c2a9c","Type":"ContainerStarted","Data":"cea7b51689f158b8365b60e16edb5257dbe0dfa3911759388fd9de4a8a3a5129"}
Dec 03 13:01:01 crc kubenswrapper[4711]: I1203 13:01:01.610850 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-cron-29412781-slgkq" event={"ID":"c731c517-c15e-41a0-bcc5-54ad823c2a9c","Type":"ContainerStarted","Data":"8226bfe38749fa7d7a8d0658904497d919b5dba94e9fc170b8c14cd801ea0d1a"}
Dec 03 13:01:01 crc kubenswrapper[4711]: I1203 13:01:01.627313 4711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-cron-29412781-slgkq" podStartSLOduration=1.627295151 podStartE2EDuration="1.627295151s" podCreationTimestamp="2025-12-03 13:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 13:01:01.624996899 +0000 UTC m=+2780.294248174" watchObservedRunningTime="2025-12-03 13:01:01.627295151 +0000 UTC m=+2780.296546406"
Dec 03 13:01:03 crc kubenswrapper[4711]: I1203 13:01:03.624008 4711 generic.go:334] "Generic (PLEG): container finished" podID="c731c517-c15e-41a0-bcc5-54ad823c2a9c" containerID="cea7b51689f158b8365b60e16edb5257dbe0dfa3911759388fd9de4a8a3a5129" exitCode=0
Dec 03 13:01:03 crc kubenswrapper[4711]: I1203 13:01:03.624041 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-cron-29412781-slgkq" event={"ID":"c731c517-c15e-41a0-bcc5-54ad823c2a9c","Type":"ContainerDied","Data":"cea7b51689f158b8365b60e16edb5257dbe0dfa3911759388fd9de4a8a3a5129"}
Dec 03 13:01:04 crc kubenswrapper[4711]: I1203 13:01:04.948149 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-cron-29412781-slgkq"
Dec 03 13:01:05 crc kubenswrapper[4711]: I1203 13:01:05.110637 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c731c517-c15e-41a0-bcc5-54ad823c2a9c-config-data\") pod \"c731c517-c15e-41a0-bcc5-54ad823c2a9c\" (UID: \"c731c517-c15e-41a0-bcc5-54ad823c2a9c\") "
Dec 03 13:01:05 crc kubenswrapper[4711]: I1203 13:01:05.110708 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c731c517-c15e-41a0-bcc5-54ad823c2a9c-fernet-keys\") pod \"c731c517-c15e-41a0-bcc5-54ad823c2a9c\" (UID: \"c731c517-c15e-41a0-bcc5-54ad823c2a9c\") "
Dec 03 13:01:05 crc kubenswrapper[4711]: I1203 13:01:05.110751 4711 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bxqd\" (UniqueName: \"kubernetes.io/projected/c731c517-c15e-41a0-bcc5-54ad823c2a9c-kube-api-access-9bxqd\") pod \"c731c517-c15e-41a0-bcc5-54ad823c2a9c\" (UID: \"c731c517-c15e-41a0-bcc5-54ad823c2a9c\") "
Dec 03 13:01:05 crc kubenswrapper[4711]: I1203 13:01:05.116661 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c731c517-c15e-41a0-bcc5-54ad823c2a9c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c731c517-c15e-41a0-bcc5-54ad823c2a9c" (UID: "c731c517-c15e-41a0-bcc5-54ad823c2a9c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 13:01:05 crc kubenswrapper[4711]: I1203 13:01:05.117040 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c731c517-c15e-41a0-bcc5-54ad823c2a9c-kube-api-access-9bxqd" (OuterVolumeSpecName: "kube-api-access-9bxqd") pod "c731c517-c15e-41a0-bcc5-54ad823c2a9c" (UID: "c731c517-c15e-41a0-bcc5-54ad823c2a9c"). InnerVolumeSpecName "kube-api-access-9bxqd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 13:01:05 crc kubenswrapper[4711]: I1203 13:01:05.151019 4711 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c731c517-c15e-41a0-bcc5-54ad823c2a9c-config-data" (OuterVolumeSpecName: "config-data") pod "c731c517-c15e-41a0-bcc5-54ad823c2a9c" (UID: "c731c517-c15e-41a0-bcc5-54ad823c2a9c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 13:01:05 crc kubenswrapper[4711]: I1203 13:01:05.212068 4711 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c731c517-c15e-41a0-bcc5-54ad823c2a9c-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 13:01:05 crc kubenswrapper[4711]: I1203 13:01:05.212108 4711 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c731c517-c15e-41a0-bcc5-54ad823c2a9c-fernet-keys\") on node \"crc\" DevicePath \"\""
Dec 03 13:01:05 crc kubenswrapper[4711]: I1203 13:01:05.212119 4711 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bxqd\" (UniqueName: \"kubernetes.io/projected/c731c517-c15e-41a0-bcc5-54ad823c2a9c-kube-api-access-9bxqd\") on node \"crc\" DevicePath \"\""
Dec 03 13:01:05 crc kubenswrapper[4711]: I1203 13:01:05.640172 4711 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-cron-29412781-slgkq" event={"ID":"c731c517-c15e-41a0-bcc5-54ad823c2a9c","Type":"ContainerDied","Data":"8226bfe38749fa7d7a8d0658904497d919b5dba94e9fc170b8c14cd801ea0d1a"}
Dec 03 13:01:05 crc kubenswrapper[4711]: I1203 13:01:05.640221 4711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8226bfe38749fa7d7a8d0658904497d919b5dba94e9fc170b8c14cd801ea0d1a"
Dec 03 13:01:05 crc kubenswrapper[4711]: I1203 13:01:05.640246 4711 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-cron-29412781-slgkq"
Dec 03 13:03:05 crc kubenswrapper[4711]: I1203 13:03:05.401581 4711 patch_prober.go:28] interesting pod/machine-config-daemon-52jgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 13:03:05 crc kubenswrapper[4711]: I1203 13:03:05.402221 4711 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-52jgg" podUID="776e7d35-d59b-4d4a-97cd-aec4f2441c1e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"